Basics of EMR/Atmospheric Effects: Foundations of Remote Sensing
The Electromagnetic Spectrum
The USGS defines the electromagnetic spectrum in the following manner:
Electromagnetic waves may be classified
by frequency or wavelength.
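Since wavelength and frequency are linked by the speed of light (c = λν), either classification determines the other. A minimal sketch of the conversion, in Python:

```python
# Sketch: converting between wavelength and frequency using c = lambda * nu.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def wavelength_to_frequency(wavelength_m: float) -> float:
    """Frequency in hertz for a wavelength given in meters."""
    return C / wavelength_m

def frequency_to_wavelength(frequency_hz: float) -> float:
    """Wavelength in meters for a frequency given in hertz."""
    return C / frequency_hz

# Red light at 0.7 microns (0.7e-6 m) is roughly 4.3e14 Hz:
red_hz = wavelength_to_frequency(0.7e-6)
```

Because the relation is symmetric, the same constant serves both directions; shorter wavelengths always mean higher frequencies and more energetic waves.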
Wave Phenomena Concepts
Electromagnetic waves are generated by hydrogen fusion taking place in the sun. When electromagnetic energy encounters anything, even a very tiny object like a molecule of air or water, one of three reactions will occur: the radiation will be reflected off the object, absorbed by it, or transmitted through it. This is possible because the energy carried by electromagnetic waves can behave both like a particle (the photon) and like a wave (the oscillating electric/magnetic fields).
The total amount of radiation that strikes an object is referred to as the "incident radiation".
In earth science remote sensing, we are largely concerned with reflected radiation (reflected sunlight). Scientists and other users of remotely sensed imagery (photography or satellite images) make use of the wavelength information by way of what are called "spectral signatures". Through the interpretation of aerial photography we will develop our sense of pattern recognition and identification, as well as our understanding of how reflected sunlight behaves. The basic idea is that reflected light consists of a continuum of wavelengths, and that certain wavelengths respond differently depending on what is reflecting them. What do wet golf courses in the desert look like in color infrared wavelengths? Understanding the principle behind the answer to this question will provide you with a means of gathering information.
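The idea behind spectral signatures can be sketched as a nearest-match comparison of reflectance values per wavelength band. The band set and the reflectance numbers below are illustrative assumptions, not measured values:

```python
# Sketch: classifying a pixel by comparing its reflectance values to
# reference "spectral signatures". Values are hypothetical, chosen only
# to illustrate the contrast between surface types.
SIGNATURES = {
    # reflectance per band: (green, red, near-infrared)
    "healthy vegetation": (0.10, 0.05, 0.50),
    "water":              (0.05, 0.03, 0.01),
    "bare soil":          (0.15, 0.20, 0.25),
}

def classify(pixel):
    """Return the signature name with the smallest squared distance."""
    def distance(sig):
        return sum((p - s) ** 2 for p, s in zip(pixel, sig))
    return min(SIGNATURES, key=lambda name: distance(SIGNATURES[name]))

# A wet, irrigated golf course reflects strongly in the near infrared,
# so its pixel matches the vegetation signature:
label = classify((0.11, 0.06, 0.48))  # -> "healthy vegetation"
```

Real classifiers use many more bands and training samples, but the principle is the same: different materials reflect different wavelengths in characteristic proportions.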
Sunlight that is reflected by our surroundings is sensed by our eyes; the different wavelengths allow us to perceive colors. The lenses of our eyes focus the light onto the backs of our eyeballs, which consist of specialized tissue composed of what are termed cones and rods. There, the different visible wavelengths of electromagnetic energy are converted to an electrical signal that our brain processes and assembles into a three-dimensional color image of the world in front of us in real time. Camera film works in a similar fashion, as do digital cameras and satellite sensor systems, but the capabilities of the processors are different, and they offer only an instantaneous snapshot of the world. Pattern recognition, for example, is not something computers do well when given an extremely complex image. Computers and digital remote sensing systems will never be able to replace people with regard to photo interpretation and feature identification. Although technological and sensor system capabilities are now becoming available that mimic the capabilities of our eyes and brain, these systems do not compare to the pattern recognition and interpreting abilities we all have naturally.
Optical remote sensing systems, our eyes included, are generally called "passive systems" because they only detect incident electromagnetic radiation that has been reflected and transmitted back up through the atmosphere. Digital thermal infrared (Thermal IR) sensors do not detect reflected radiation; they are tuned to respond to emitted heat energy (thermal wavelengths). Photography's color infrared (CIR) uses special film, lenses, and filters to capture reflected near infrared energy (sometimes called Near IR), NOT emitted thermal infrared (heat). A common misconception about "infrared photography" is that these wavelengths are associated with heat: even though EMR (electromagnetic radiation) in those wavelengths is sometimes labeled as part of the "visible" spectrum, these wavelengths cannot be sensed by our eyes and are NOT in any way associated with heat. Standard filament light bulbs, like a glowing white-hot metal rod, emit thermal infrared wavelengths and are a good source of radiant heat (Thermal IR) but not a very efficient source of visible light: the percentage of visible white light emitted by a filament bulb, based on its temperature, is very small compared to the percentage given off as invisible thermal energy that does not help us see. Color infrared (CIR) photographs result from film that has reacted to sunlight REFLECTED from the surface, NOT emitted (that's Thermal IR).
The electromagnetic spectrum is a continuum of electromagnetic wavelengths. All matter radiates electromagnetic energy of varying wavelengths according to its temperature. Absolute zero (0 kelvin) is a theoretical temperature at which there is no molecular motion. A perfect emitter (a theoretical "black body") cannot exist in practice; most real objects are "gray bodies" that radiate somewhat less efficiently.
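The link between an object's temperature and the wavelengths it radiates most strongly can be sketched with Wien's displacement law (λ_max = b / T, with b ≈ 2898 μm·K), which the blackbody model makes quantitative:

```python
# Sketch: Wien's displacement law, lambda_max = b / T, relating a
# blackbody's temperature to its peak emission wavelength.
WIEN_B = 2898.0  # Wien's displacement constant, micron-kelvins

def peak_wavelength_microns(temperature_k: float) -> float:
    """Peak emission wavelength (microns) for a blackbody at T kelvin."""
    return WIEN_B / temperature_k

# The sun (~5800 K) peaks near 0.5 microns, in the visible range;
# the earth's surface (~300 K) peaks near 10 microns, in the thermal IR.
sun_peak = peak_wavelength_microns(5800.0)
earth_peak = peak_wavelength_microns(300.0)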
The wave-like behavior of electromagnetic radiation accounts for the familiar "refraction" that produces rainbows (because water droplets act as spherical prisms) and for the sunsets that result from scattering of light by atmospheric particles. The particle-like behavior is responsible for the exchange of energy. (See Jensen, Remote Sensing of the Environment: An Earth Resource Perspective, Second Edition, Aerial Photography Filtration, pages 97-98.)
The micron is the most commonly used unit for measuring the wavelength of electromagnetic waves. The spectrum is divided into sections based on wavelength. The shortest, most energetic (highest frequency) waves are called gamma rays. The longest, least energetic (lowest frequency) waves are called radio waves, which are measured in meters or kilometers. The visible light our eyes can sense occupies a narrow portion of the spectrum, from 0.4 microns (blue) to 0.7 microns (red). The photographic range of wavelengths extends beyond the visible into the near infrared, and ultraviolet photography is possible with highly specialized lenses and film.
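The division of the spectrum into named sections by wavelength can be sketched as a simple lookup. The boundary values below are approximate and vary between textbooks:

```python
# Sketch: naming the spectral region for a wavelength given in microns.
# Boundaries are approximate conventions, not exact physical limits.
REGIONS = [
    (0.4, "ultraviolet"),            # below 0.4 microns
    (0.7, "visible"),                # 0.4-0.7 microns, blue through red
    (3.0, "near/shortwave infrared"),
    (14.0, "thermal infrared"),
]

def region(wavelength_um: float) -> str:
    for upper, name in REGIONS:
        if wavelength_um < upper:
            return name
    return "microwave/radio"

# region(0.55) -> "visible"; region(10.0) -> "thermal infrared"
```

Note that the photographic range discussed above spans the "ultraviolet" through "near/shortwave infrared" entries, while Thermal IR sensors operate in the thermal band.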
In 1903 or 1904 the first reliable black and white infrared film was developed in Germany. The film emulsion was adjusted slightly from regular film to be sensitive to wavelengths of energy just slightly longer than red light and just beyond the range of the human eye. By the 1930s, black and white infrared films were being used for landform studies, and from 1930 to 1932 the National Geographic Society sponsored a series of infrared photographs taken from hot air balloons.
Throughout the 1930s and 1940s the military was hard at work developing color infrared film for surveillance. By the early 1940s a film had been developed that could distinguish camouflaged equipment and non-natural material from surrounding vegetation. In response, paint and fabric designed to fool CIR film were developed for use on military vehicles and troops, effectively neutralizing the military value of CIR film technology.
Color infrared film is often called "false-color" film. Objects that are normally red appear green, green objects (except vegetation) appear blue, and "infrared" objects, which normally are not seen at all, appear red (see additive color). The quality of the film and camera, the time of year, climatic conditions, and how the film is developed all affect how landscapes appear in CIR photos. The primary use of color infrared film is for studies involving vegetation, such as wetlands mapping or ecosystem monitoring. Healthy green vegetation is a very strong reflector of infrared radiation and usually appears bright red on color infrared photographs (depending on how the film is developed).
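The false-color shift described above can be sketched for a digital image as a simple band reassignment, assuming input bands ordered (green, red, near-infrared):

```python
# Sketch: the CIR "false-color" band shift for a digital pixel, assuming
# input bands ordered (green, red, near-infrared). NIR is displayed as
# red, red as green, and green as blue.
def to_false_color(pixel):
    green, red, nir = pixel
    return (nir, red, green)  # (display R, display G, display B)

# Healthy vegetation reflects weakly in green and red but very strongly
# in the NIR, so it displays as bright red in the false-color composite.
vegetation = (0.10, 0.05, 0.50)
display = to_false_color(vegetation)  # -> (0.50, 0.05, 0.10)
```

This is the digital analogue of what the film chemistry does: each invisible NIR value is moved into a visible display channel so the eye can see it.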