Remote Sensing

Event Information |
---|---
Type | Earth Science
Category | Study
Description | Participants will use remote sensing imagery, data, and computational process skills to complete tasks related to climate change processes in the Earth system.
Latest Appearance | 2023
In Remote Sensing, a Division C event, teams use remote sensing images, such as photographic and spectroscopic information, to analyze data and/or make climate models.
Each team may bring a binder of any size, as well as a metric ruler, a protractor, and any kind of (non-graphing) calculator.
Remote Sensing was most recently run nationally in the 2017, 2018, and 2022 seasons.
Tests tend to consist of a mix of image interpretation and questions on concepts of remote sensing and climate change processes (the carbon cycle, aerosols, ozone depletion, etc.). Some ecology/biology background is useful, as are meteorology and basic physics concepts. Knowledge of individual space programs and NASA satellites, as well as the types of sensors used, is recommended.
Note that acronyms are used often in this event. Throughout this page, acronyms are listed alongside key terms, since the acronyms themselves can appear as questions (e.g., what does RADAR stand for?). There are many of them, but do not be confused by them: they are shorthand for the ideas they represent, not independent entities.
Topics
Remote Sensing rotates between topics occasionally. In addition, the specific aspects of remote sensing, including the satellites and types of data focused on, often change between years. The topic for the 2022 season was the same as in 2017 and 2018: Climate Change Processes.
Season | Topic |
---|---|
2022 | Climate Change Processes |
2018 | Climate Change Processes |
2017 | Climate Change Processes |
2013 | Earth's Hydrosphere |
2012 | Earth's Hydrosphere |
2011 | Human Impact on Earth |
2010 | Human Interactions with Forest Biomes |
2009 | Human Land Use Patterns? |
2008 | Mars |
2007 | Mars |
2006 | |
2005 | |
2004 | |
2003 | |
2002 |
Satellites
History
While not an integral part of the event, students are expected to know the basic history behind remote sensing.
The Beginning
Remote sensing began in the 1860s, when the Frenchman Gaspard-Félix "Nadar" Tournachon took aerial photographs from a balloon. In the 1880s, cameras mounted on kites took pictures via a remote control mechanism. When heavier-than-air flight was invented, it was only logical to take pictures from airplanes as well. During WWI, cameras mounted on planes or held by aviators were used in military reconnaissance.
Remote Sensing Advances
As early as 1904, rockets were used to launch cameras to heights of 600 meters. But until the 1960s, aerial photographs from airplanes were the only way to take pictures of Earth's landscape. It took the space race between the US and the USSR to start remote sensing from above the atmosphere.
The first satellite was Sputnik 1, launched by the USSR on 4 October 1957. Sputnik helped to identify the density of high atmospheric layers and provided data on radio signals in the ionosphere. In early 1955, the US began working on Project Orbiter, which used a Jupiter-C rocket to launch a satellite. The project, led by the legendary rocket scientist Wernher von Braun, succeeded, and Explorer 1 became the first US satellite on January 31, 1958.
Examples of Instruments
Instruments are instrumental (no pun intended) to the function of satellites and remote sensing. Know what types of instruments will be used for certain applications.
- RADAR: short for Radio Detection and Ranging. The name comes from WWII, when radio waves were actually used in radar; modern-day radars use microwaves. The waves are scattered and reflected when they come into contact with something. Microwaves can pass through water droplets and are generally used with active remote sensing systems. Radar is good for locating objects and measuring elevation (see the ranging example after this list).
- LIDAR: short for Light Detection and Ranging. It is similar to RADAR but uses laser pulses and continuous-wave lasers instead of radio waves.
- TM: stands for Thematic Mapper. It was introduced in the Landsat program and involves seven image data bands that scan along a ground track.
- ETM+: stands for Enhanced Thematic Mapper Plus. It replaced the TM sensor on Landsat 7. Unlike TM, it has eight bands.
- MSS: stands for Multispectral Scanner. It was introduced in the Landsat program also, and each band responds to a different type of radiation, thus the name “multispectral”.
- MODIS: stands for Moderate-resolution Imaging Spectroradiometer. It is on the Terra and Aqua satellites. It measures cloud properties and radiative energy flux.
- CERES: stands for Clouds and the Earth's Radiant Energy System. It is on the Terra and Aqua satellites. It measures broadband radiative energy flux.
- SeaWiFS (Sea-viewing Wide Field-of-View Sensor): uses eight spectral bands of very narrow wavelength ranges to monitor ocean primary production and phytoplankton processes, ocean influences on climate processes (heat storage and aerosol formation), and the cycles of carbon, sulfur, and nitrogen.
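As promised above, both RADAR and LIDAR determine distance by timing an echo. As a worked example (a generic round-trip ranging relation, not any specific instrument's processing chain), the range is
[math]\displaystyle{ d = \frac{c \, \Delta t}{2} }[/math]
so a return delay of [math]\displaystyle{ \Delta t = 1\ \text{ms} }[/math] gives [math]\displaystyle{ d = \frac{(3 \times 10^8\ \text{m/s})(10^{-3}\ \text{s})}{2} = 150\ \text{km} }[/math].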
Other Instruments
- ALI: Advanced Land Imager
- ASTER: Advanced Spaceborne Thermal Emission and Reflection Radiometer
- ATSR: Along Track Scanning Radiometer
- AVHRR: Advanced Very High Resolution Radiometer (used with NOAA)
- AVIRIS: Airborne Visual/Infrared Imaging Spectrometer
- CCD: Charge Coupled Devices
- CZCS: Coastal Zone Color Scanner
- GPS: Global Positioning System
- HRV: High Resolution Visible sensor
- LISS-III: Linear Imaging Self-Scanning Sensor
- MESSR: Multispectral Electronic Self-Scanning Radiometer
- MISR: Multi-angle Imaging Spectro Radiometer
- MSR: Microwave Scanning Radiometer
- RAR: Real Aperture Radar
- VTIR: Visible and Thermal Infrared Radiometer
- WiFS: Wide Field Sensor
This is in no way a comprehensive list of the instruments tested on in Remote Sensing. Participants are encouraged to at least demonstrate basic knowledge about how these instruments work.
For those who are interested, a large list of acronyms used by NASA can be found here.
Active Sensing vs. Passive Sensing
Active sensing occurs when the satellite produces radiation on its own and then senses the backscatter. This is useful since it does not depend on outside radiation, but it uses up energy more quickly. Examples of active sensors are the laser fluorosensor and synthetic aperture radar (SAR). These instruments operate even at night, because they do not rely on reflected radiation (which is usually solar in origin). Passive sensing, on the other hand, senses naturally available radiation to create a picture. It does not need to use energy to produce radiation, but it depends on outside radiation being present: if there is little or none, the satellite cannot function well. One exception is thermal infrared (TIR) sensors, which actually obtain better information at night because of the natural emission of thermal energy.
Examples of Satellites
There are countless numbers of satellites currently orbiting Earth, but tests will mostly focus on satellite programs directed by NASA (National Aeronautics and Space Administration). Some other agencies of note include the NOAA (National Oceanic and Atmospheric Administration), the CSA (Canadian Space Agency), JAXA (Japanese Aerospace Exploration Agency), the ESA (European Space Agency) and the IRS (Indian Remote Sensing).
EOS
Satellites especially likely to appear on tests are those that come from the Earth Observing System (EOS). The EOS is a series of NASA satellites designed to observe the Earth's land, atmosphere, biosphere, and hydrosphere. The first EOS satellite was launched in 1997.
A-Train, or EOS-PM: an EOS satellite constellation in Sun synchronous (SS) orbit, planned to comprise seven satellites working together. Their combined observations have produced high-resolution images of the Earth's surface.
The A-Train (also known as the Afternoon constellation because of its south-to-north equatorial crossing time of 1:30 PM) currently has four satellites flying in the constellation.
- Active:
- OCO-2: studies global atmospheric carbon dioxide. It is a replacement for the failed OCO.
- GCOM-W1 "SHIZUKU": studies the water cycle.
- Aqua: studies the water cycle, such as precipitation and evaporation.
- Aura: studies Earth's ozone layer, air quality, and climate.
- Past:
- PARASOL: studied radiative and microphysical properties of clouds and air particles; moved to lower orbit and deactivated in 2013.
- CloudSat: studies the altitude and other properties of clouds; moved to lower orbit following partial mechanical failure in February 2018.
- CALIPSO: studied clouds and air particles and their effect on Earth's climate; moved to CloudSat's orbit in September 2018. CloudSat and CALIPSO now make up the C-train.
- Failed to achieve orbit:
- OCO (Orbiting Carbon Observatory): was intended to study atmospheric carbon dioxide.
- Glory: was to study radiative and microphysical properties of air particles.
Both failures occurred because of launch vehicle failure.
Morning Constellation, or EOS-AM, is a second constellation in the EOS. It is so named because of its 10:30 AM north-to-south equatorial crossing. Currently there are three satellites in this constellation - the fourth, EO-1, was decommissioned in March 2017.
- Landsat: A series of 9 satellites using multiple spectral bands. Three are operational today: Landsat 7, Landsat 8 (launched in February 2013), and Landsat 9 (launched in September 2021); Landsat 9 data became available through USGS in early 2022. These are some of the most commonly tested satellites. The name Landsat is a blend of the words "land" and "satellite".
- Terra: Provides global data on the atmosphere, land, and water. Its scientific focus includes atmospheric composition, biogeochemistry, climate change and variability, and the water and energy cycle.
Remember that the aforementioned satellites hardly comprise an exhaustive list - the EOS is a very large collection of satellites. It is almost impossible to know all of them thoroughly, so it is suggested that participants familiarize themselves with the most important ones.
Other Satellites
There are other notable satellites that may appear on exams and are not affiliated with the EOS.
- GOES (Geostationary Operational Environmental Satellite) system: two weather satellites in geostationary orbit at roughly 36,000 km altitude. It is partially organized by NASA in cooperation with NOAA.
- MOS: Marine Observation Satellite
- SeaSat: SEA SATellite. This satellite is especially significant as the first satellite focused on the oceans, as well as the first satellite to carry synthetic aperture radar (SAR).
- SPOT: Système Pour l'Observation de la Terre. This is a series of 7 CNES satellites similar to the Landsat program, with a more commercial focus.
Satellite Imaging
The formal definition of remote sensing is the science of acquiring data about an object without being in physical contact with it. For this reason, a major part of this event involves processing images and analyzing them to come to a conclusion.
Image Processing
Satellite data is sent from the satellite to the ground in a raw digital format. The smallest unit of data is represented by a binary number. This data will be strung together into a digital stream and applied to a single dot, or pixel (short for "picture element") which gets a value known as a Digital Number (DN). Each DN corresponds to a particular shade of gray that was detected by the satellite. These pixels, when arranged together in the correct order, form an image of the target where the varying shades of gray represent the varying energy levels detected on the target. The pixels are arranged into a matrix. This data is then either stored in the remote sensing platform temporarily, or transmitted to the ground. The data can then be manipulated mathematically for various reasons, which will be discussed below.
The human eye can only distinguish between about 16 shades of gray in an image, but it is able to distinguish between millions of colors. Thus, a common image enhancement technique is to assign specific DN values to specific colors, increasing the contrast. A true color image is one for which the colors have been assigned to DN values that represent the actual spectral range of the colors used in the image. A photograph is an example of a true color image. False color (FC) is a technique by which colors are assigned to spectral bands that do not equate to the spectral range of the selected color. This allows an analyst to highlight particular features of interest using a color scheme that makes the features stand out.
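Since DN-to-color assignment is just a lookup from gray values to colors, it is easy to sketch in code. Below is a minimal illustration in Python, assuming 8-bit DNs stored in a NumPy array; the image data and the three-color rule are made up for demonstration.

```python
# Minimal sketch of assigning colors to DN ranges (a crude false-color rule).
# Assumes 8-bit DNs (0-255) in a NumPy array; the image here is synthetic.
import numpy as np

dns = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)  # fake 4x4 scene

rgb = np.zeros(dns.shape + (3,), dtype=np.uint8)
rgb[dns < 85] = [0, 0, 255]                    # low DNs    -> blue
rgb[(dns >= 85) & (dns < 170)] = [0, 255, 0]   # middle DNs -> green
rgb[dns >= 170] = [255, 0, 0]                  # high DNs   -> red
```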
Additionally, remote sensing instruments often operate continuously for very long times, which means that they are prone to instrument error and malfunction. Different processing techniques can remedy this.
Composites
Composites are images where multiple individual satellite images have been combined to produce a new image. This process is used to create more detailed images that take multiple factors into account, as well as to find patterns that would not have been revealed in a single image. It also helps to create larger images than the satellite itself can make. This is because each satellite covers a specific swath, or area imaged by a satellite with a fixed width. When these swaths are put together into a composite, a larger area is imaged.
Traditionally, composites were made by merging colors. Three black-and-white transparencies of the same scene are first made, each representing a different spectral band: blue, green, and red. Each band is then projected through a filter of the same color (the blue band through a blue filter, green through green, red through red) onto a screen. Because areas bright in the blue band are clear on that transparency, they appear blue on the composite. When the three images are aligned, the resulting image has the natural color (or very close to it) of the original. This process is called "color additive viewing", and red, green, and blue are often known as additive colors because they add together to create new colors. The earliest color films recorded multispectral scenes on multicolored films, which were then developed and merged into one colored film image.
In satellite data, this crude method is replaced by assigning color gradient values to DNs. When the three colorized images are merged, a true color image is formed.
Not all composites are true colored. A False Color Composite (FCC) results when one band is assigned color gradient values for another color, such as if the blue band DNs were set to correspond to shades of red. One extremely popular FCC combination assigns red colors to the NIR image. Since healthy vegetation reflects strongly in the NIR region of the EM spectrum, a FCC using this combination displays areas with healthy vegetation as red. This also differentiates natural features from artificial ones: a football field made up of healthy grass has a strong red color, but a football field composed of Astroturf or other artificial substances will show up as a duller red, or even dark brown.
Common composites:
- True-color composite: useful for interpreting man-made objects. Simply assign the red, green, and blue bands to their respective colors in the image.
- Blue-near IR-mid IR: the blue channel uses visible blue, the green channel uses near-infrared (so vegetation stays green), and mid-infrared is shown as red. Such images show water depth, vegetation coverage, soil moisture content, and the presence of fires, all in a single image.
- Near IR is usually assigned to red on the image; thus, vegetation often appears bright red in false color images, rather than green, because healthy vegetation reflects a great deal of near IR radiation. This can also be used to identify urban/artificial areas (a code sketch of such a composite follows this list).
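As a code sketch of the band-to-channel assignment described above (a minimal illustration, assuming three co-registered bands already loaded as NumPy arrays; random values stand in for real imagery):

```python
# Minimal sketch of a classic NIR false-color composite:
# NIR -> red channel, visible red -> green channel, visible green -> blue channel.
import numpy as np

nir = np.random.rand(100, 100)    # near-infrared band (stand-in data)
red = np.random.rand(100, 100)    # visible red band (stand-in data)
grn = np.random.rand(100, 100)    # visible green band (stand-in data)

# Stack into an RGB cube; healthy vegetation (high NIR) renders bright red.
composite = np.dstack([nir, red, grn])
composite = (255 * composite / composite.max()).astype(np.uint8)
```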
Contrast
Contrast refers to the difference in relative brightness between an item and its surroundings as seen in the image. A particular feature is easily detected in an image when the contrast between the item and its background is high. However, when the contrast is low, an item might go undetected.
Resolution
Resolution is a property of an image that describes the level of detail that can be discerned from it. This is important, as images with higher resolution will have higher detail. There are several types of resolution that are important to remote sensing. One of these is spatial resolution, which is the smallest detail a sensor can detect. Since the smallest element in a satellite image is a pixel, spatial resolution describes the area on the Earth's surface represented by each pixel. For example, in a weather satellite image that has a resolution of 1 km, each pixel represents the brightness of an area that is 1 km by 1 km.
Other types of resolution include spectral resolution, the ability of a sensor to distinguish between fine wavelength intervals; radiometric resolution, the ability of a sensor to discriminate very small differences in energy; and temporal resolution, the time between successive views of the same area.
A key thing to keep in mind is that the resolution of a particular satellite sensor must be optimized for the intended use of the data. Weather satellites generally monitor weather patterns that cover hundreds of miles, so there is no need for resolution higher than 0.5 km. Landsat and other land-use satellites need to distinguish between much smaller items, so a higher resolution is required. The trade-off for higher resolution, however, is that the amount of data produced by the satellite is much greater, which increases transmission times and burdens the mission. In addition, smaller areas contain less radiometric output, so spectral resolution generally decreases for increased spatial resolution.
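To make the data-volume trade-off concrete (a simple arithmetic illustration with made-up numbers, not any specific sensor): a 180 km by 180 km scene at 30 m resolution is 6,000 × 6,000 = 36 million pixels, while the same scene at 15 m resolution is 12,000 × 12,000 = 144 million pixels. Halving the pixel size quadruples the data volume.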
Image Enhancement
Images often need to be enhanced to facilitate easier interpretation. These enhancements include, but are not limited to, contrast enhancement, haze/noise corrections, instrument corrections, and smoothing operations.
- Contrast enhancement: usually, the two extreme DN values are resampled to the minimum value, 0, and the maximum value that the sensor supports. All values in between are recalculated either by a line of best fit (linear contrast stretching) or by the frequency of the values in the image (histogram stretching). A code sketch of this operation follows this list.
- Haze corrections reflect the fact that haze usually causes a uniform increase across all DNs in an area. The sensor is calibrated over a source with a known radiative output (such as a body of water, which would have a value approaching 0) and all DNs in the area are resampled by subtracting that value.
- Noise corrections involve the use of kernels, matrices of DNs surrounding a central DN. A weighted average of the kernel is computed, and a threshold is set for the difference between this average and the central DN. If the difference exceeds the threshold, the central DN is reassigned the average value; if the difference is less than the threshold, the DN is conserved.
- Instrument corrections generally account for malfunctioning sensors on the CCD. Corrections are similar to those used in haze corrections.
- Smoothing operations simply create smoother images with less detail, which serves a variety of purposes.
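As promised above, here is a minimal sketch of two of these operations (linear contrast stretching and a mean-kernel smoothing pass), assuming a single-band image in a NumPy array; the data is synthetic:

```python
# Minimal sketch of linear contrast stretching and 3x3 mean-kernel smoothing.
# Assumes a single-band image as a NumPy float array; synthetic data here.
import numpy as np

img = np.random.randint(40, 180, size=(50, 50)).astype(float)  # fake low-contrast scene

# Linear contrast stretch: remap the extreme DNs to the full 0-255 range.
lo, hi = img.min(), img.max()
stretched = ((img - lo) / (hi - lo) * 255).astype(np.uint8)

# 3x3 mean kernel: replace each DN with the average of its neighborhood,
# trading fine detail for a smoother image (edges padded by replication).
padded = np.pad(img, 1, mode="edge")
rows, cols = img.shape
smoothed = np.zeros_like(img)
for dr in (0, 1, 2):
    for dc in (0, 1, 2):
        smoothed += padded[dr:dr + rows, dc:dc + cols]
smoothed /= 9
```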
Snell's Law
Snell's law describes how light refracts when it enters a new medium. Light travels more slowly through every non-vacuum medium, as it is absorbed, re-emitted, and scattered by particles along the way. The index of refraction of a medium is the speed of light in a vacuum divided by the speed of light in the medium. These indices are used in Snell's law:
[math]\displaystyle{ n_1 \sin θ_1 = n_2 \sin θ_2 }[/math]
[math]\displaystyle{ θ_1 }[/math] is the angle of the incident radiation and [math]\displaystyle{ n_1 }[/math] is the refractive index of the medium the light is entering from. [math]\displaystyle{ θ_2 }[/math] is the angle at which the light travels once it enters the new medium and [math]\displaystyle{ n_2 }[/math] is the refractive index of the new medium. The critical angle is the angle of incidence at which the refracted angle is 90°; any incident angle greater than the critical angle is totally internally reflected. Common refractive indices include water (1.33), glass (1.52), and diamond (2.42); air is rounded to 1.
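As a worked example using the indices above: for light passing from water ([math]\displaystyle{ n_1 = 1.33 }[/math]) into air ([math]\displaystyle{ n_2 \approx 1 }[/math]), the critical angle satisfies [math]\displaystyle{ \sin θ_c = n_2/n_1 = 1/1.33 }[/math], so [math]\displaystyle{ θ_c \approx 48.8° }[/math]; any steeper incident angle is totally internally reflected.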
The Electromagnetic Spectrum
Electromagnetic radiation (EMR) is the most common energy source for remote sensing. It consists of an electric field and a magnetic field perpendicular to each other and to the direction of travel, propagating at the speed of light. This is important to remote sensing because sensors detect data about the objects a satellite is studying by measuring this radiation.
Radiation matters because different materials respond to radiation in different ways, which can be used to identify objects. One example is scattering (or atmospheric scattering), where particles in the atmosphere redirect radiation. There are three types: Rayleigh, Mie, and non-selective. Scattering is used to identify the presence and quantity of certain gases in the atmosphere. Transmission, by contrast, occurs when radiation passes through a target, indicating the target is unaffected by that particular wavelength.
There are several types of electromagnetic energy that can be emitted, depending on their wavelength. All of them are found in the electromagnetic spectrum (EMS).
It's important to know which types of energy are useful for what:
- Gamma rays and x-rays cannot be used for remote sensing because they are absorbed by the Earth's atmosphere: in general, the shorter the wavelength (and the greater the frequency), the more absorption occurs.
- Ultraviolet radiation is not useful either because it is blocked by the ozone layer.
- Visible light allows satellites to detect colors a human eye would see. Some of these satellites are panchromatic, meaning they are sensitive to all wavelengths of visible light.
- Near Infrared (NIR), the region just past the visible portion of the spectrum, is useful for monitoring vegetation, as healthy vegetation reflects much NIR.
- Short Wave Infrared (SWIR), the region just beyond NIR, is useful for determining the spectral signature of objects. A spectral signature is the telltale reflectance of radiation by a material across the spectrum. Each object has a unique spectral signature.
- Thermal Infrared (TIR), the kind of IR we perceive as "heat". The Earth naturally emits TIR, so TIR remote sensing usually involves passively detecting radiation in this region of the spectrum. This is useful for determining temperatures of objects.
- Microwaves are used in radar. Different radars utilize different wavelengths, which range from relatively shortwave regions, such as the W-band, to longwave regions such as the L-band. Radars are very good at penetrating foliage, and in the case of longwave radars, the ground, to a certain extent. Microwaves also reveal a lot about the properties of a surface, such as its dielectric constant, moisture, etc.
Since the atmosphere's components selectively absorb certain wavelengths, only certain regions of the spectrum can actually be used in remote sensing. These regions are known as atmospheric windows.
Blackbody Radiation
In physics, a blackbody is an ideal object which absorbs and re-emits 100% of all incident radiation. The spectral signature for a blackbody is modeled by a blackbody curve, determined by the Planck Function. The blackbody curve is dependent on temperature. In practice, blackbodies do not exist; instead, most objects are graybodies, which emit a certain percentage of the radiation absorbed. This percentage is known as emissivity.
Integrating a Planck function blackbody curve gives the total radiant exitance, or power emitted per unit area, of an object. Radiant exitance is specified by the Stefan-Boltzmann Law. For more information on this law, see Reach for the Stars#Stefan-Boltzmann's Law and Climate Notes#Radiation equations.
The peak wavelength of an object is the wavelength at which the greatest energy is emitted; this is the peak of the blackbody curve. The peak wavelength of an object can be determined by Wien's Displacement Law.
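In equation form, Wien's Displacement Law is [math]\displaystyle{ \lambda_{max} = b/T }[/math], where [math]\displaystyle{ b \approx 2.898 \times 10^{-3}\ \text{m·K} }[/math]. As worked examples: the Sun ([math]\displaystyle{ T \approx 5778\ \text{K} }[/math]) peaks near [math]\displaystyle{ 5.0 \times 10^{-7}\ \text{m} }[/math], in visible light, while the Earth ([math]\displaystyle{ T \approx 288\ \text{K} }[/math]) peaks near [math]\displaystyle{ 10\ \mu\text{m} }[/math], in the thermal infrared; this is why TIR sensing works well for surface temperatures.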
The dominant wavelength of an object is the wavelength which matches the perceived hue of the object. The sun appears yellow (when affected by Earth's atmosphere; it is actually white) because its dominant wavelength is in the yellow portion of the visible light region of the EM spectrum.
Albedo
Albedo is, simply put, the percentage of incident radiation that is reflected off of an object. Albedo and emissivity are important concepts to understand and differentiate: whereas albedo concerns incident light that is reflected from a surface, emissivity concerns blackbody radiation that is emitted by an object. In real-life applications, white objects, which reflect more wavelengths, have higher albedos. This reduces the amount of radiation absorbed, which in turn reduces the amount of radiation emitted.
Image Interpretation
For the basics of image interpretation, see Road Scholar#Satellite Images.
Image interpretation and analysis is a huge part of the Remote Sensing event. It involves locating, identifying, or measuring certain objects in remotely sensed images. This is not as straightforward as it may seem: plenty of features in each image can throw you off. However, some features are consistent across images. There will always be a "target" to look for, which will contrast with other parts of the image, making it "distinguishable".
According to the Canada Centre for Remote Sensing, whose tutorial you can find in the external links section, there are several things to look for to assist in image interpretation. These are tone, shape, size, pattern, texture, shadow, and association.
- Tone is the brightness or color of an object. It's the main way to distinguish targets from backgrounds.
- Shape refers to the form or outline of an object. A straight-edged shape is usually man-made, such as agricultural or urban structures; irregular-edged shapes usually form naturally.
- Size, relative or absolute, can be determined by finding common objects in images, such as trees or roads. (see Finding Area section, below)
- Pattern refers to the arrangement of objects in an image, such as the spacing between buildings in an urban setting.
- Texture is the arrangement of tone variation throughout the image.
- Shadow can help determine size and distinguish objects.
- Association refers to things that commonly appear together in photographs, which can assist interpretation (e.g., boats on a lake).
Finding Area
Another major part of image interpretation is determining the surface area of a particular area of interest. You will often be asked on a test to find the area of some piece of land, but this piece of land is usually not regularly shaped, like a rectangle. It'll have lots of different curves, and at first, it may seem difficult to find the exact area. However, an easy way to estimate area is to split up this irregular shape into smaller, easier shapes, like rectangles or circles. Then, you can add up the areas of the individual shapes to get the total area of the piece of land.
Before doing this, though, you need to take scale into consideration. Scale is the ratio of size on image to real-life size. For example, if the scale on an image is 1 inch:25 miles, each inch on the image represents 25 miles in real life. To find the area of one of your shapes, measure its dimensions with your ruler in inches (or centimeters, if the scale says so) and then multiply that number by the scale to find how long each of your dimensions is in real life. Do this for all of your smaller, more regular shapes. Then, just find the areas of all of them and add them together. Your answer should be approximately the area of the piece of land. It will not be exact, nor will it need to be, as test graders should have a range of values that they will accept as being correct.
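As a quick worked example (with made-up numbers): if the scale is 1 cm : 10 km and a roughly rectangular lake measures 2 cm by 3 cm on the image, the real lake is about 20 km by 30 km, for an area of roughly [math]\displaystyle{ 20 \times 30 = 600\ \text{km}^2 }[/math].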
NDVI
During the competition, you may be asked to analyze a picture's NDVI values. NDVI stands for "Normalized Difference Vegetation Index" and is used to describe various land types, usually to determine whether or not the image contains vegetation. The equation provided by USGS for NDVI is as follows:
[math]\displaystyle{ NDVI = \frac{Channel\ 2 - Channel\ 1}{Channel\ 2 + Channel\ 1} }[/math]
Channel 1 is in the red light part of the electromagnetic spectrum. In this region, chlorophyll absorbs much of the incoming sunlight. Channel 2 is in the Near Infrared part of the spectrum, where the plant's mesophyll leaf structure causes strong reflectance. You may also see the equation given like so:
[math]\displaystyle{ NDVI = \frac{NIR - VIS}{NIR + VIS} }[/math]
(where NIR is Near Infrared and VIS is Visual (Red) Light)
So, healthy vegetation has a low red light reflectance (Channel 1) and a high infrared reflectance (Channel 2). This would produce a high NDVI value. As the amount of vegetation decreases, so too do the NDVI values. The range of NDVI values is -1 to +1.
Generally, areas rich in vegetation have higher positive values. Soil tends to produce NDVI values somewhat lower than vegetation: small positive values. Bodies of water, such as lakes or oceans, will have even lower positive (or, in some cases, strongly negative) values.
There are some factors that may affect NDVI values. Atmospheric conditions can have an effect on NDVI, as well as the water content of soil. Clouds sometimes produce NDVI values of their own, but if they aren't thick enough to do so, they may throw off measurements considerably.
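A per-pixel NDVI computation is straightforward once the bands are in hand. Below is a minimal sketch in Python, assuming co-registered red and NIR reflectance bands as NumPy arrays; random values stand in for real data:

```python
# Minimal sketch of computing NDVI per pixel from red and NIR bands.
# Assumes co-registered reflectance arrays; synthetic data stands in here.
import numpy as np

red = np.random.rand(100, 100)   # visible red reflectance (stand-in)
nir = np.random.rand(100, 100)   # near-infrared reflectance (stand-in)

ndvi = (nir - red) / (nir + red + 1e-9)  # tiny epsilon guards against 0/0

# Values run from -1 to +1: dense vegetation strongly positive,
# bare soil weakly positive, water near zero or negative.
print(ndvi.min(), ndvi.mean(), ndvi.max())
```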
EVI
EVI, or the Enhanced Vegetation Index, was created to improve on NDVI and eliminate some of its errors. It has improved sensitivity in regions high in biomass and eliminates canopy background signals. The equation for EVI is as follows:
[math]\displaystyle{ EVI = G \times \frac{NIR - Red}{NIR + C_1 \times Red - C_2 \times Blue + L} }[/math]
where NIR is again Near Infrared, and Red and Blue are of course those colors' bands; all three are at least partially atmospherically corrected surface reflectances. The term L filters out canopy background noise, C1 and C2 are the coefficients of aerosol resistance, and G is the gain factor. The coefficients used by the MODIS EVI product are L = 1, C1 = 6, C2 = 7.5, and G = 2.5. The product has a valid range from -10,000 to 10,000.
EVI has been adopted as a standard product of the MODIS instruments aboard NASA's Terra and Aqua satellites. Because it factors out background noise, it is often preferred over NDVI.
Climate Change Processes
The 2022 Remote Sensing topic is Climate Change Processes. Participants are encouraged to study climatology aspects as well as technology.
Human Interaction
Human Interaction with the Earth is a large part of Remote Sensing. It emphasizes how humans affect the Earth on scales detectable by remote sensing. This interaction has previously been represented in tests as deforestation, ozone layer changes, changes in land use, retreat of glaciers, and loss of sea ice, among others. It is important to remember, however, that not all climate change processes are anthropogenic, and other climatology events may be tested as well.
Global Warming
When human impact on the environment is mentioned, one of the main ideas it entails is global warming. Global warming is defined as "the increase in the average temperature of Earth's near-surface air and oceans since the mid-20th century and its projected continuation". The causes of global warming are debated, but the consensus is that the primary cause is the increase in concentrations of greenhouse gases. Possible results of global warming include a rise in sea levels, a change in weather patterns, the retreat of glaciers and sea ice, species extinctions, and an increased frequency of extreme weather.
The greenhouse effect is caused by certain greenhouse gases that trap heat in the Earth's atmosphere. The main gases, along with their percent contribution to the greenhouse effect, are water vapor (36-70%), carbon dioxide (9-26%), methane (4-9%), and ozone (3-7%). Of these, carbon dioxide is perhaps the gas most closely scrutinized as a potential human impact on the environment; thus, it is the most likely to appear on tests. Humans have increased the amounts of these and other greenhouse gases in the atmosphere during industrialization periods such as the Industrial Revolution. CFCs and nitrous oxide are among the greenhouse gases now present in the atmosphere that were not before.
Radiative Forcing
Different greenhouse gases are more or less powerful in forcing the greenhouse effect. A gas's ability to force the greenhouse effect is known as its radiative forcing and is generally given as a value (in watts per square meter) or a percentage.
Global Dimming
Global dimming refers to a natural process that cools the earth. Generally speaking, global dimming is caused not by gases but by aerosols, such as sulfates and chlorates. These aerosols occur naturally as well as artificially. For example, large-scale volcanic eruptions emit a lot of sulfate into the atmosphere. The 1991 eruption of Mt. Pinatubo dropped the average global temperature by 0.5 degrees C due to the global dimming effect. This is because these aerosols reflect incoming solar radiation very effectively.
Furthermore, sulfates and chlorate aerosols in the atmosphere serve as cloud condensation nuclei, or CCNs. CCNs are particles upon which water vapor can more favorably condense, forming clouds. These clouds reflect significantly more incoming radiation than normal clouds due to the Twomey effect. This results in cooling.
Chlorates, and to some degree sulfates, deplete ozone. In addition, sulfates react with water to form sulfuric acid, which precipitates as acid rain.
Climate Cycles
A number of cycles are integral to the study of climatology. The two most important cycles are the carbon cycle and the hydrological (water) cycle.
Carbon Cycle
The carbon cycle is the process through which carbon atoms are cycled through the environment. It cycles through the atmosphere as carbon dioxide, and some carbon is dissolved into the hydrosphere. It is also taken in by plants during photosynthesis and released when the plants die. When animals feed on plants, they also take in carbon.
However, the burning of fossil fuels, which come from biomatter, releases excess carbon into the atmosphere, increasing the concentration of carbon dioxide. Carbon can be stored for long periods of time in trees and soil in forest biomes, so altering this balance affects the cycling of carbon and can contribute to global warming and climate change. The resulting warming can then affect plant growth, since slight changes in temperature or other factors can kill off certain species of plants. With fewer plants remaining alive, more carbon dioxide stays in the atmosphere rather than being taken in by plants.
The carbon cycle indicates the presence of what are known as carbon sources and sinks. A source emits carbon into the atmosphere or the biosphere, and a sink absorbs it from the atmosphere, storing that carbon somewhere inaccessible. Oceans, for example, are carbon sinks. In addition, phytoplankton in the oceans photosynthesize and utilize dissolved carbon dioxide to produce energy. Plants also absorb carbon; tropical rainforests, and other biomes containing a very rich and diverse population of flora, are also carbon sinks. However, deciduous plants shed leaves and suspend photosynthesis in the winter, leading to the seasonal fluctuations observed in the Keeling Curve.
Natural sources of carbon include wildfires and volcanic eruptions. However, the sources of carbon that are of most interest to scientists are usually anthropogenic - the burning of fossil fuels such as coal or petroleum, for example.
Hydrological Cycle
The hydrologic cycle, more commonly known as the water cycle, describes the cycle through which water travels. Its base is the more commonly known cycle of evaporation, condensation, and precipitation. Among smaller parts of the water cycle, water is stored as ice and snow in cold climates. It also enters the ground through infiltration, although some simply flows over it as surface runoff. The groundwater flow then takes this water to the oceans where it reenters the main cycle.
Finally, some evaporation occurs as evapotranspiration in plants. Fewer plants would result in less carbon taken in, and thus more carbon dioxide in the atmosphere contributing to the greenhouse effect.
Water vapor is the most prevalent greenhouse gas, and so monitoring all aspects of the water cycle also provides valuable insight and data concerning climate change and the validity of the greenhouse effect.
Key Climate Concepts
The Ozone Layer
The ozone layer is a naturally occurring layer of ozone in the stratosphere. Ozone blocks harmful UV from reaching the surface. Ozone can interact with various gases in the atmosphere, some natural, others artificial. These gases may destroy ozone, leading to ozone depletion. Ozone depletion may be seasonal or anthropogenic.
Ozone is formed from the reaction of a free oxygen atom with a molecule of oxygen. The resulting molecule is relatively unstable. When a photon carrying sufficient energy, such as that of UV, hits the molecule, the molecule will absorb the energy and break back into its reactants. Therefore, ozone shields the surface of the Earth from ionizing radiation, which would otherwise be very detrimental to life.
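Written out as standard atmospheric chemistry, ozone formation and photodissociation are:
[math]\displaystyle{ O + O_2 \rightarrow O_3 }[/math]
[math]\displaystyle{ O_3 + h\nu \rightarrow O_2 + O }[/math]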
Ozone Depletion
Ozone depletion has been greatly accelerated by pollutants in the atmosphere. Before the "ozone hole" was discovered, many propellants and refrigerants used chlorofluorocarbons (CFCs). In the atmosphere, chlorine atoms break off from the CFC molecule. Chlorine is a very efficient catalyst in the breakdown of ozone: one atom of chlorine can degrade up to ten thousand molecules of ozone. CFCs have been phased out since the Montreal Protocol went into effect in 1989. It is estimated that recovery may last until the mid-21st century.
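The catalytic cycle, in standard chemical notation, is:
[math]\displaystyle{ Cl + O_3 \rightarrow ClO + O_2 }[/math]
[math]\displaystyle{ ClO + O \rightarrow Cl + O_2 }[/math]
The net effect is [math]\displaystyle{ O_3 + O \rightarrow 2O_2 }[/math], with the chlorine atom regenerated to destroy more ozone.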
CFCs have been replaced with hydrofluorocarbons (HFCs), which do not contain chlorine and therefore do not deplete ozone.
Ocean Acidification
Previously, it was mentioned that oceans dissolve carbon dioxide. This is not the entire explanation. In reality, carbon dioxide reacts with water to form carbonic acid, which can dissociate into carbonate and bicarbonate anions.
Carbon dioxide, carbonic acid, carbonate, and bicarbonate exist in a temperature-dependent equilibrium. The higher the temperature, the more the equilibrium shifts towards the production of carbonate and bicarbonate, releasing H+ cations that accumulate and lower the pH of the water. This process is known as ocean acidification.
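The equilibrium chain can be written out as:
[math]\displaystyle{ CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^- \rightleftharpoons 2H^+ + CO_3^{2-} }[/math]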
Ocean acidification is detrimental to marine life. Organisms particularly sensitive to pH changes include coral and phytoplankton. Corals support the most diverse ecosystems in the ocean, and phytoplankton form the staple of many food chains. Coral is made of a calcium carbonate (aragonite/calcite) skeleton, which dissolves in acidic water. In addition, corals produce a natural sunscreen that protects them from ionizing radiation, a process that is slowed in warmer environments. The biodiversity of many coral reefs, including the Great Barrier Reef off Australia, is threatened by what is known as coral bleaching (dead coral appears bleached).
Scientists measure these processes either by directly measuring the sea surface temperature (SST) using thermal infrared remote sensing, or by measuring chlorophyll concentrations and phytoplankton presence/health by proxy. MODIS, for example, has bands dedicated to measuring chlorophyll concentrations.
Sample Questions
- The ______________ ______________ is a naturally occurring process that aids in heating the Earth's surface and atmosphere.
- The term _________ is used to describe the total mass of organic matter.
- _______ _______ refers to any fuel that is created from decomposed carbon-based plant and animal organisms.
- A _____________ is the smallest element that can be displayed on a satellite image or computer monitor.
- Define Albedo.
- Earth is in a __________ orbit.
- The geometric shape of satellite orbital paths around the Earth is called what?
- Define Eccentricity.
- Define Obliquity.
- Define Precession.
- What is the name of the scientist who first proposed the astronomical theory for climate change?
- According to the theory, how might changes in the eccentricity, obliquity, and precession result in an ice age?
Resources
Textbooks
Links
- Official Science Olympiad remote sensing page
- This remote sensing tutorial written by the Canada Centre for Remote Sensing is very useful. It covers basic concepts of remote sensing, sensor types, image interpretation and analysis, and the use of data.
- The NASA tutorial (archived from the original) is more advanced than the Canada one, and it is recommended reading after the Canada one has already been read. It may be difficult to read both due to time constraints; however, most of the substance in this tutorial will not be necessary on most tests. Good if time permits.
- This is good for a very brief overview of the topic of remote sensing.
- This is a good source for all of the bands of the major satellites.
Older links
NOTE: These links are not relevant to the 2022 event
- Mars Topographic Map (http://mars.jpl.nasa.gov), as referenced by the official rules. No longer applicable due to rule changes.
- File 1 and File 2
- Direct links to the Mars Topographic Maps from pubs.usgs.gov - note they are large in file size. No longer applicable due to rule changes.