
What is Imaging Spectroscopy (Hyperspectral Imaging)?

The main objective of imaging spectroscopy (also known as hyperspectral imaging in the industrial and military communities) is to measure the spectral signatures and/or chemical composition of all features within the sensor's field of view. Hyperspectral data contain both spatial and spectral information from materials within a given scene: each pixel holds a measurement in every one of a sequence of continuous, narrow spectral bands. Sampling a particular spatial location across the many narrowband images of the "spectral cube" yields a one-dimensional spectrum, a plot of radiance or reflectance versus wavelength. That spectrum can be used to identify and characterize a particular feature within the scene based on its unique spectral signature or "fingerprint". Spectral data can be obtained from either space-based or airborne platforms and typically involve recording many narrowband images simultaneously, using some type of dispersion grating to produce the spectrum.
Hyperspectral imaging in the emissive region of the electromagnetic spectrum
typically involves the portion of the electromagnetic spectrum associated with
primarily vibrational motion of molecules, and to some degree rotational and
vibrational-rotational modes. These modes of molecular vibrations occur in the mid-
infrared (3 to 5 microns) and the longwave-infrared (8-14 microns). The mid-IR and
longwave-IR are sometimes referred to as the "fingerprint" region of the
electromagnetic spectrum, since many effluents and gases have distinctive absorption
features used in their identification. The analysis of effluents and gases between 3 to 5
microns can be problematic since both reflective and emissive properties are involved.
The region of the EM spectrum associated with electronic transitions of molecules is the visible to near-IR, along with the shortwave-IR (SWIR). Analysis of hyperspectral data across the visible, near-IR, and SWIR portions of the spectrum deals with the reflective nature of solid and liquid materials.

The reflective region of the spectrum ranges from 0.38 to about 3 microns, where the
shorter wavelength is limited by Earth's atmospheric cutoff in the near-UV. The
emissive region ranges from 7 to 15 microns when applied to hyperspectral imaging.
LWIR sensors are expensive to build, so there are very few hyperspectral sensors that operate over this wavelength regime. The wavelength range from 3 to 5 microns contains a mixed contribution from both reflective and emissive radiation, making this region of the spectrum difficult to analyze. Nevertheless, the mixed MWIR region from 3 to 5 microns is very useful for identifying specific gases, which have many unique absorption features in their spectral signatures that make identification possible. The LWIR region (8 to 14.5 microns) is also used to identify various gases.

Note the complex mixture of both reflective and emissive properties between about 3
and 7 microns (MWIR region), making this region of the EM spectrum difficult to
work in. The Earth has a peak emission near 10 microns corresponding to a
temperature of approximately 289 K (from Wien's Displacement Law). The Earth's
peak reflectance is centered near 0.5 microns (for the Sun's effective temperature of
5780 K). It is interesting to note that this wavelength of peak solar emission coincides closely with the peak sensitivity of human vision.
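As a quick numerical check of the temperatures quoted above, Wien's displacement law gives the wavelength of peak blackbody emission as lambda_max = b/T, with b approximately 2898 micron-kelvin. A minimal Python sketch (values illustrative):

```python
# Wien's displacement law: wavelength of peak blackbody emission.
# Illustrative check of the temperatures quoted above.
WIEN_B_UM_K = 2897.8  # displacement constant in micron·kelvin

def peak_wavelength_um(temperature_k):
    """Wavelength (microns) of peak blackbody emission at the given temperature."""
    return WIEN_B_UM_K / temperature_k

print(peak_wavelength_um(289.0))   # ~10.0 microns (Earth's thermal emission)
print(peak_wavelength_um(5780.0))  # ~0.5 microns (reflected sunlight)
```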
One advantage of working in the LWIR is that there are none of the problems associated with solar illumination that are encountered in the VNIR/SWIR portion of the spectrum, where the reflective properties of the target dominate. Imaging spectrometers operating in the LWIR can also be used at night. In both the MWIR and LWIR, it is
the emissive properties of materials that dominate their nature. Different materials
within a given scene can be identified and characterized based on their unique
emissivity, a measure of how efficiently a particular material radiates energy in
comparison with a blackbody (i.e. a perfect emitter and absorber) at the same
temperature. The emissivity of a material depends on the wavelength and the
molecular properties of the material, and it is the unique emissive signature of the
object that can be used to identify it when working in the Mid-IR or Longwave-IR
portion of the spectrum.

The difference between multispectral and hyperspectral imaging is illustrated in the diagram shown below. Broadband sensors typically produce panchromatic images with very wide bandwidths of several hundred nanometers. WorldView-1, for example, produced broadband (panchromatic) images with a high spatial resolution of 50 centimeters; the bandwidth of its panchromatic band is 500 nanometers, spanning 400 to 900 nanometers. Multispectral imaging involves taking imagery over several discrete spectral bands of moderate bandwidth. Most multispectral imagers have four basic spectral bands: blue, green, red, and near-IR. Some multispectral imaging satellites, such as Landsat 7, have additional spectral bands in the infrared region of the spectrum. Hyperspectral imaging systems can
obtain imagery over hundreds of narrow, continuous spectral bands with typical
bandwidths of 10 nanometers or less. For example, the AVIRIS airborne
hyperspectral imaging sensor obtains spectral data over 224 continuous channels, each
with a bandwidth of 10 nm over a spectral range from 400 to 2500 nanometers. An
example of an operational space-based hyperspectral imaging platform is the Air Force Research Lab's TacSat-3/ARTEMIS sensor, which has 400 continuous spectral channels, each with a bandwidth of 5 nm. Ultraspectral sensors represent the future of hyperspectral imaging technology; they are defined as having thousands of spectral channels, each with a bandwidth narrower than those of hyperspectral sensors (<0.1 nm) (Kudenov et al., 2015, Micro- and Nanotechnology Sensors, Systems, and Applications VII, Proc. SPIE 9467).
Panchromatic imagery is good for detecting various objects, materials and activities
within a given area. Multispectral imagery can provide information on broad classes of scene features, such as the presence of healthy vegetation or bodies of water; that is, multispectral data allows an analyst to separate scene features into generic classes, or into groups of features with similar spectral properties. Hyperspectral imaging
allows much finer sampling of the spectrum of scene features. It can be used to
identify, and in some cases characterize scene features based on their unique spectral
signatures (absorption bands or emissive features). Ultraspectral (in addition to some
advanced hyperspectral sensors with especially narrow spectral bandwidths) sensors
will allow a quantitative assessment of scene materials (solids, liquids and gases). For
example, the abundance of different gases or effluents could be determined based on
the width and strength of absorption features in a given spectrum.
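To make the last point concrete, the strength of an absorption feature is commonly quantified as a continuum-removed band depth. The sketch below assumes a simple straight-line continuum between two shoulder wavelengths; the band positions and synthetic spectrum are purely illustrative:

```python
import numpy as np

def band_depth(wavelengths, spectrum, left, center, right):
    """Continuum-removed depth of an absorption feature.

    A straight-line continuum is drawn between the reflectance at the two
    shoulder wavelengths (left, right); depth = 1 - R_center / R_continuum.
    """
    r = np.interp([left, center, right], wavelengths, spectrum)
    # linear continuum interpolated at the band center
    frac = (center - left) / (right - left)
    r_cont = r[0] + frac * (r[2] - r[0])
    return 1.0 - r[1] / r_cont

# Example: a synthetic spectrum with an absorption feature near 2.2 microns
wl = np.linspace(2.0, 2.4, 41)
refl = 0.5 - 0.15 * np.exp(-((wl - 2.2) / 0.02) ** 2)
print(band_depth(wl, refl, left=2.1, center=2.2, right=2.3))  # ~0.3
```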

The spectra shown below illustrate an example of an observed scene feature, in this case the mineral Alunite, a sulfate mineral found in volcanic rocks, typically formed in acid-sulfate hydrothermal-vein systems (e.g. Yellowstone National Park). The three spectra clearly show the advantage of hyperspectral imaging over multispectral sensors (MODIS and TM). The laboratory-collected spectrum, with a spectral resolution typical of hyperspectral imaging systems, has a much finer sampling than the two spectra collected with multispectral sensors. The high-resolution spectrum shows many unique features (absorption bands) that are not seen in the multispectral data (top two spectra). The multispectral sensors would not be able to identify the mineral Alunite based on its distinctive absorption doublet near 1.45 microns.

Sub-pixel detection allows one to quantitatively determine fractional abundances of materials within a single pixel using advanced mathematical techniques (described later in the presentation). This allows analysts to detect particular materials that
occupy less than one pixel (illustrated by the yellow color), provided there is high
spectral contrast between the target material and its background. A significant
advantage of hyperspectral imaging over most other methods of remote sensing is that
a sensor does not have to resolve a given target to obtain information on the target
itself.

The diagram below shows the basic scenarios leading to the formation of absorption
features, emission features and no features in spectra. A warm gas between the sensor
and a cold background will result in a spectrum with various emission lines. The
reverse is true when a gas plume/cloud is located between the sensor and a warm
background - i.e. absorption features are produced. It is these unique emission and absorption lines that are used to identify the particular gas. If there is no temperature difference between the gas plume/cloud and the background, then no emission or absorption features will be seen in the spectrum, making identification of the gas impossible.
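A simplified single-layer radiative-transfer picture captures why the temperature contrast matters: the at-sensor radiance is roughly L = tau * B(T_background) + (1 - tau) * B(T_gas), so the feature vanishes when the gas and background temperatures are equal. The sketch below uses this two-temperature approximation; it is not a full radiative-transfer model:

```python
import numpy as np

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck_radiance(wavelength_um, temperature_k):
    """Blackbody spectral radiance (W m^-2 sr^-1 m^-1) at one wavelength."""
    lam = wavelength_um * 1e-6
    return (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * temperature_k))

def plume_contrast(wavelength_um, t_gas, t_background, transmittance):
    """Positive result -> emission feature; negative -> absorption; zero -> no feature."""
    b_gas = planck_radiance(wavelength_um, t_gas)
    b_bg = planck_radiance(wavelength_um, t_background)
    observed = transmittance * b_bg + (1 - transmittance) * b_gas
    return observed - b_bg

print(plume_contrast(10.0, t_gas=310.0, t_background=290.0, transmittance=0.9))  # > 0: emission
print(plume_contrast(10.0, t_gas=290.0, t_background=290.0, transmittance=0.9))  # ~ 0: no feature
```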

There exist many applications of hyperspectral imaging over different regions of the
electromagnetic spectrum, ranging from the VNIR to LWIR. For example, the
military typically designs various camouflage to mimic the spectral signature of
vegetation. This allows the material to blend in with background vegetation when
viewed using spectral sensors having near-IR bands. Both background vegetation and
the material show the unique "red edge" feature of vegetation. However, hyperspectral
sensors using both Near-IR and SWIR spectral bands can be used to discriminate
camouflage material from background vegetation, since there are significant
differences in spectral signatures between camouflage and vegetation in the
Shortwave-IR (SWIR), due to differences in the moisture content of background
vegetation, for example.
A typical workflow for the analysis of hyperspectral data is summarized in the
diagram below. Typically, one corrects raw hyperspectral "data cubes" for
atmospheric effects. This converts radiance to reflectance so that observed spectra can
be compared to library reference spectra. Atmospheric correction is the most critical
processing step in hyperspectral data analysis. A bad atmospheric correction will
result in false positives when the data are analysed using various techniques. One of the first steps in analysing spectral cubes is to separate out noisy spectral bands and to eliminate the highly redundant spectral bands typical of hyperspectral data. This is done using what is known as a Minimum Noise Fraction (MNF) transform, which is essentially two cascaded Principal Component transforms, the second transform being performed on noise-whitened data. The MNF transform reduces the dimensionality of the hyperspectral data, facilitating faster processing. Further information about PC and MNF transforms is available in many books and papers covering the processing of hyperspectral imagery. Once the data has been
reduced to the most important/relevant spectral bands, one typically derives spectral
"endmembers" from the data. An endmember is a term used to describe a pure spectral
signature of a particular material. These endmembers can be derived from the data
itself, or from spectral libraries and field collected spectra. After the endmembers are
collected from the data itself or library spectra, they are used by various spectral
mapping methods. The main goal of these various spectral mapping methods is to
produce a final product, such as a scene classification or thematic map, a material
identification map, or target detection map.
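As a rough illustration of the MNF step described in the workflow above (noise whitening followed by a principal component transform), here is a minimal numpy sketch. The shift-difference noise estimate and the (rows, cols, bands) cube layout are assumptions; production tools handle this far more carefully:

```python
import numpy as np

def mnf_transform(cube):
    """Minimum Noise Fraction transform of a (rows, cols, bands) data cube.

    Noise covariance is estimated from differences between horizontally
    adjacent pixels (a common shift-difference approximation), the data are
    whitened against that noise, and a standard principal component
    transform is then applied to the whitened data.
    """
    rows, cols, bands = cube.shape
    x = cube.reshape(-1, bands).astype(float)
    x -= x.mean(axis=0)

    # 1. Estimate the noise covariance from horizontal neighbour differences.
    diff = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, bands)
    noise_cov = np.cov(diff, rowvar=False) / 2.0

    # 2. Whiten the data with respect to the noise covariance.
    n_vals, n_vecs = np.linalg.eigh(noise_cov)
    whiten = n_vecs / np.sqrt(np.maximum(n_vals, 1e-12))
    x_white = x @ whiten

    # 3. Principal component transform of the noise-whitened data.
    s_vals, s_vecs = np.linalg.eigh(np.cov(x_white, rowvar=False))
    order = np.argsort(s_vals)[::-1]          # highest signal-to-noise first
    return (x_white @ s_vecs[:, order]).reshape(rows, cols, bands)

# Usage on a small synthetic cube (50 x 50 pixels, 20 bands)
cube = np.random.rand(50, 50, 20)
mnf = mnf_transform(cube)
```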
One of the most commonly used spectral mapping methods is the Spectral Angle Mapper, or SAM. This method simply treats each spectrum as a vector in an n-dimensional scatter plot (or n-D space) and computes the angle between the reference and observed spectra. The smaller the angular separation, the closer the match between the observed and reference spectra.
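A minimal sketch of the SAM calculation (the variable names are illustrative):

```python
import numpy as np

def spectral_angle(observed, reference):
    """Angle (radians) between an observed and a reference spectrum.

    Smaller angles indicate a closer spectral match; the measure is
    insensitive to overall brightness because only direction matters.
    """
    observed = np.asarray(observed, dtype=float)
    reference = np.asarray(reference, dtype=float)
    cos_angle = np.dot(observed, reference) / (
        np.linalg.norm(observed) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_angle, -1.0, 1.0))

# Same spectral shape at different brightness -> angle near zero
print(spectral_angle([0.2, 0.4, 0.6], [0.1, 0.2, 0.3]))
```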
More advanced spectral mapping methods include Matched Filters and Adaptive
Coherence Estimators. Due to time constraints, the specific mathematics will not be discussed here, but the actual equations used by these advanced methods are shown for reference. Essentially, matched filters and similar techniques try to maximize the response of the target spectrum while suppressing background clutter.
The Adaptive Coherence Estimator or ACE models the background clutter using the
data's statistics (covariance matrix). ACE is commonly used as a target finding
technique since one does not have to have knowledge of all the endmembers within a
given scene, and because the method does not depend on the relative scaling of input
spectra. Some hyperspectral analysis tools improve upon the conventional Matched
Filter by incorporating an "infeasibility parameter" that describes how likely a "false
positive" is (e.g. ENVI's Mixture Tuned Matched Filter). These advanced matched
filters essentially combine the benefits of both conventional matched filter techniques
and linear mixture theory. This makes the Mixture Tuned Matched Filter especially
useful for sub-pixel analysis of scene materials.
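For reference, below is a hedged numpy sketch of the standard matched filter and ACE scores, with the background mean and covariance estimated from the scene itself. Exact normalizations vary between implementations, and this is not ENVI's specific algorithm:

```python
import numpy as np

def matched_filter_scores(pixels, target, mean, cov):
    """Classic matched filter: maximize target response, suppress clutter."""
    cov_inv = np.linalg.inv(cov)
    s = target - mean
    x = pixels - mean
    return (x @ cov_inv @ s) / (s @ cov_inv @ s)

def ace_scores(pixels, target, mean, cov):
    """Adaptive Coherence Estimator; invariant to relative scaling of the spectra."""
    cov_inv = np.linalg.inv(cov)
    s = target - mean
    x = pixels - mean
    num = (x @ cov_inv @ s) ** 2
    den = (s @ cov_inv @ s) * np.einsum('ij,jk,ik->i', x, cov_inv, x)
    return num / den

# pixels: (N, bands) scene spectra; target: (bands,) reference spectrum
pixels = np.random.rand(1000, 50)
target = np.random.rand(50)
mean, cov = pixels.mean(axis=0), np.cov(pixels, rowvar=False)
mf = matched_filter_scores(pixels, target, mean, cov)
ace = ace_scores(pixels, target, mean, cov)
```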
Linear Unmixing (applied to areal mixtures) involves the solution of a set of n linear equations for each pixel, where n is the number of spectral bands. The solution yields a set of fractional abundances for each material within the single pixel. The ability to perform linear unmixing on hyperspectral data allows analysts to identify materials or objects within a given scene that are not necessarily resolved in the image. This is an example of what is known as "non-literal" analysis, in contrast to literal analysis, where objects are identified by eye. For intimate mixtures of granular materials, nonlinear unmixing techniques are applied.
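A minimal sketch of linear unmixing with a non-negativity constraint on the abundances. It uses SciPy's non-negative least squares; the sum-to-one constraint often added in practice is omitted here, and the endmember spectra are synthetic:

```python
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(pixel_spectrum, endmembers):
    """Solve pixel ≈ endmembers @ abundances with non-negative abundances.

    endmembers: (n_bands, n_endmembers) matrix of pure-material spectra.
    Returns the fractional abundances and the residual norm.
    """
    abundances, residual = nnls(endmembers, pixel_spectrum)
    return abundances, residual

# Synthetic example: a pixel that is 70% material A and 30% material B
bands = np.linspace(0.4, 2.5, 100)
mat_a = 0.3 + 0.2 * np.sin(bands)
mat_b = 0.6 - 0.1 * bands
endmembers = np.column_stack([mat_a, mat_b])
pixel = 0.7 * mat_a + 0.3 * mat_b
print(unmix_pixel(pixel, endmembers)[0])  # ~[0.7, 0.3]
```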
The high spectral resolution of hyperspectral sensors allows the clear identification of
the "red edge" feature of healthy vegetation. This feature is the result of the high
reflectivity in the near-IR and absorption in the red spectral bands. Vegetation that is
stressed will show higher reflectivity in the Shortwave-IR portion of the spectrum. A
complete understanding of the high-resolution spectral signature of vegetation
involves the particular state of the cell structure, water content, biochemicals, and
pigments within the vegetation. Healthy vegetation will absorb in both the blue and red bands, giving rise to what is called the "green bump" of healthy vegetation. As vegetation is stressed, or as its chlorophyll content changes, the "green bump" feature will change, along with the reflectivity in the near-IR and shortwave-IR portions of the spectrum. When viewed using the standard false-color composite (Near-IR/Red/Green), healthy vegetation shows up as deep red. A quantitative measurement of the health and density of vegetation is carried out using the Normalized Difference Vegetation Index, or NDVI, a contrast ratio using a red and a near-IR spectral band: (NIR - Red)/(NIR + Red). Index values can range between -1.0 and 1.0, but vegetation typically has values between 0.1 and 0.7.
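A minimal sketch of the NDVI calculation on per-pixel band values (the reflectance values in the example are illustrative):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)  # small epsilon avoids divide-by-zero

print(ndvi(0.45, 0.08))  # healthy vegetation: high NIR, low red -> ~0.7
print(ndvi(0.25, 0.20))  # sparse or stressed cover -> ~0.1
```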
The images below show how hyperspectral imaging (in this case data obtained from the space-based Hyperion sensor) can be used to image burn scars and hot spots (seen as orange and bright orange spots in the right image) through smoke resulting from wildfires. The smoke is more transparent in the SWIR bands than in the VNIR bands. Using a contrast ratio of two different SWIR bands, a Burn Index (BI) can be created to measure the severity of burn scars.
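In the same spirit as the NDVI sketch above, a burn index can be formed as a normalized difference of two SWIR bands. The band choice and sign convention below are placeholders, not Hyperion's actual band set:

```python
import numpy as np

def burn_index(swir_band_1, swir_band_2):
    """Normalized-difference contrast ratio of two SWIR bands.

    The band choice (and therefore the sign convention) is sensor and study
    dependent; the idea is simply that burn scars change the relative
    reflectance of the two SWIR bands.
    """
    b1 = np.asarray(swir_band_1, dtype=float)
    b2 = np.asarray(swir_band_2, dtype=float)
    return (b2 - b1) / (b2 + b1 + 1e-12)

# Illustrative per-pixel reflectances in the two SWIR bands
print(burn_index(0.15, 0.30))  # ~0.33
print(burn_index(0.25, 0.26))  # ~0.02
```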
The spectral maps below show an example of mineral mapping, one of the major applications of hyperspectral imaging, where high spectral resolution is necessary to identify specific minerals from the unique absorption features produced by the interaction of radiation with each mineral's unique crystalline structure. In this example, a matched filter was used along with a USGS reference spectrum of the alteration mineral Kaolinite to detect its location at Cuprite, Nevada. In the MF detection map, the white areas indicate the presence of Kaolinite. The Minimum Noise Fraction transform (shown in the lower left image) reveals the diversity of minerals at the Cuprite, Nevada calibration test site. The top left pane shows the difference between the USGS reference spectrum (blue line) and the actual AVIRIS spectrum (red line). The fit to the specific absorption doublet feature at slightly less than 2.2 microns indicates the identification of the mineral Kaolinite. The SWIR portion of the spectrum between 2.0 and 2.5 microns is most commonly used to map minerals.
The following slide shows Matched Filter detections of three different alteration
minerals at the Cuprite, Nevada site. Kaolinite, Alunite, and Buddingtonite are shown
as different color overlays on top of a single baseline SWIR band.
Hyperspectral imaging is especially useful for assessing environmental disasters, such
as the 2010 Gulf Oil Spill. The location of oil slicks floating on the surface of ocean
water can be identified using several unique absorption bands due to the C-H bond of
the hydrocarbon. The 2.3-micron absorption feature, caused by different rotational modes of the hydrocarbon molecule, is sensitive to small amounts of oil. The 1.73-micron absorption feature, the result of the hydrocarbon molecule's stretch mode, is sensitive to thicker accumulations of oil. In contrast to multispectral
imaging, which can locate oil slicks by their distinctive color on ocean water,
hyperspectral imaging allows a quantitative assessment of the amount of oil present.

There are also many military applications of hyperspectral imaging. The high spectral
resolution of hyperspectral sensors allows one to discriminate not only camouflage
from background clutter, but also different types of camouflage from one another. Note the common spectral feature of the two types of camouflage shown: both "mimic" the red edge of vegetation, so both would appear to blend in with background vegetation if they were imaged using conventional NIR/Red/Green multispectral imaging systems. However,
hyperspectral imaging systems with expanded spectral coverage in addition to higher
spectral resolution can differentiate the different types of camouflage, especially when
examined in the SWIR portion of the spectrum. The SWIR bands also allow the
discrimination between the two types of camouflage and the background vegetation.
In conclusion, every feature within a given scene has unique spectral properties due to
its molecular structure, and the way that molecular structure interacts with radiation to
cause reflective or emissive signatures. The LWIR, and to some extent the MWIR, is known as the "fingerprint" region of the spectrum for identifying gases and effluents.
Spatial resolution is not as important as spectral resolution in hyperspectral imaging
applications, since sub-pixel analysis using various advanced mathematical methods is
possible. Derivative spectroscopy is a hot topic of research within the hyperspectral
imaging community. It is used to enhance/amplify very minor details in spectral
signatures. The future of hyperspectral imaging technology is leaning towards the use
of active hyperspectral imaging techniques, where the imaging system provides its
own source of controlled illumination. This technique promises to reduce or eliminate
problems associated with solar illumination artifacts and shadows encountered with
today's conventional hyperspectral imaging systems.
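As a small illustration of the derivative-spectroscopy idea mentioned above, a Savitzky-Golay first derivative amplifies subtle slope changes in a spectrum relative to a flat continuum. The window length, polynomial order, and synthetic feature below are illustrative choices:

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic reflectance spectrum with a weak absorption feature near 1.7 microns
wavelengths = np.linspace(0.4, 2.5, 211)           # 0.01 micron sampling
spectrum = 0.5 - 0.01 * np.exp(-((wavelengths - 1.7) / 0.05) ** 2)

# First derivative via a Savitzky-Golay filter: smooths while differentiating,
# which helps pull the weak feature out of the flat continuum.
first_deriv = savgol_filter(spectrum, window_length=11, polyorder=3,
                            deriv=1, delta=wavelengths[1] - wavelengths[0])

# Wavelength of the strongest negative slope (short-wavelength side of the feature)
print(wavelengths[np.argmin(first_deriv)])
```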

Future space-based optical interferometers equipped with imaging spectrometers will be able to obtain integrated spectra of the full disk of Earth-like exoplanets to search for biosignatures in their atmospheres. Potential biosignature gases (and their main spectral features) include:

Water: 2.7, 6.3, 19.51 microns
Nitrous Oxide: 3.8, 4.5, 7.78, 17 microns
Methane: 3.3, 7.7 microns
Ozone: 9.65 microns
Oxygen: 0.69, 0.76, 1.26 microns
Carbon Dioxide: 2.7, 4.3, 15 microns
Carbon Monoxide: 4.7 microns
Nitric Acid: 11.5 microns
Chlorophyll a: 6.76 microns

Other potential biosignatures include ammonia, sulfur dioxide, H2S, CH3OH, CH3Cl, and DMS.

Solid Signatures on Rocky Planets

H2O Ice: 1.25, 1.5, 2.0 microns
Silicates: 1.0, 2.0 microns (broad)
Ferric Oxides: 1.0 microns
Carbonates: 2.35, 2.5 microns
Hydrated Silicates: 3.0-3.5 microns (broad)

Pigments on Earth-sized planets orbiting stars somewhat brighter than the Sun could absorb blue light (450 nm) and reflect yellow, orange, red, or a combination of these colors. For stars cooler than the Sun (M type), evolution might favor photosynthetic pigments that pick up the full range of visible and IR light. With little light reflected, plants might look dark to human eyes. The red edge spectral position could therefore be shifted for other Earth-like planets with a different parent star.

Photosynthesis on Earth produces the most detectable signs of life at the global scale.
The presence of oxygen or ozone in an atmosphere simultaneously with reduced gases
like methane is considered a robust biosignature (Des Marais et al., 2002). A
challenging, complementary observation to atmospheric oxygen would be detection of
the vegetation red edge - the strong contrast in red absorbance and near-infrared
reflectance of plant leaves due to green chlorophyll. Although the reason for the placement of the Earth's red edge at 0.7 microns is still not fully explained, scientists have proposed that it is due to the function of chlorophyll a (Björn et al., 2009).

Terrestrial biosignatures resulting from biological species include a disequilibrium in atmospheric gas species, the red edge of plant life due to the enhanced reflectivity in the near-IR and strong absorption in the red, and biosignatures that vary with time, such as seasonal variations in atmospheric composition and/or surface albedo.

Any small amount of molecular oxygen in an Earth-like planet's atmosphere produced by photolysis of water vapor is consumed through oxidation of surface rocks and volcanic gases. Thus, if oxygen and liquid water are simultaneously observed in a spectrum, there must be some additional source producing the oxygen. The most
spectrum, there must be some additional source producing the oxygen. The most
likely source would be oxygenic photosynthesis. If ozone and liquid water are seen in
a spectrum, it would be a very strong biosignature. The formation of ozone (O3)
requires the presence of oxygen in the planet's atmosphere, since UV radiation
dissociates molecular oxygen, which then recombines to form ozone. Ozone has a
spectral signature in the infrared part of the spectrum, making it easier to detect than
oxygen (which is detected at visible wavelengths). If both oxygen and methane are
detected together, it is a strong indication that photosynthesis is occurring. Also, if
imaging spectroscopy detects a seasonal trend (variation) of methane abundances, it is
an indication of life because methane levels will eventually begin to decrease due to
dissociation from stellar radiation. Methyl chloride might be an indicator of plant life burning in fires; on Earth it is also produced by an interaction between sunlight, ocean plankton, and chlorine in seawater. However, oxidation acts as a sink, and its signature may be too weak to detect. Nitrous oxide is released as vegetation decays on Earth. Since abiotic sources of this gas (lightning, etc.) are negligible, it could be used as a possible biosignature.
The following diagrams show simulated (synthetic) spectra of Earth-like planets. The spectra cover the spectral range from 0.5 to 20 microns, and many of the biosignatures discussed above are visible. The bottom spectra are for an Earth-sized planet (1 g) around the Sun (black), AD Leo (red), an M0 dwarf star (green), an M5 dwarf star (blue), and an M7 dwarf (magenta). The spectra are courtesy of Rauer et al., "Potential Biosignatures in super-Earth Atmospheres", Astronomy & Astrophysics (February 16, 2011).
The three spectra shown below for Venus, Earth, and Mars illustrate the effect of life on Earth. All three terrestrial planets show strong absorption in their atmospheres due to carbon dioxide. Only the Earth's atmosphere shows two biosignatures due to life: water and ozone.
The "red edge" feature of vegetation is another possible biosignature. However, the
red edge may shift due to different types of plant life and the spectral class of the host
star. Photosynthesis by plant life produces molecular oxygen. The dissociation of H2O by UV photons, which also produces O2, appears to be an inorganic process. On Earth, oxygen is stored in the atmosphere, and part of it is destroyed by the oxidation of unoxidized rock freshly exposed by tectonic activity. However, the oxygen content of Earth's atmosphere remains significant (about 20%) because of the dominance of oxygen production from biogenic sources. The ratio of O3/O2 is an indicator of the degree of
evolution of biological activities on an earth-like planet. Ozone is produced by the
photolysis of oxygen. Ozone is a good tracer of oxygen in the atmosphere of an Earth-
like world. The spectroscopic detection of molecular oxygen and a reduced gas
(methane or nitrous oxide) provides very strong evidence for the presence of life on an
Earth-like planet.
During the NASA Galileo spacecraft's 1990 flyby of Earth, Carl Sagan (1993) carried out a controlled experiment using the NIMS instrument to look for biosignatures on Earth. The spectrometer found abundant molecular oxygen in the atmosphere, a sharp absorption edge in the red part of the visible spectrum due to vegetation, and atmospheric methane in extreme thermodynamic disequilibrium. All
of these biosignatures are highly suggestive of life on Earth. The spacecraft also found
evidence of intelligent life from the presence of narrowband, pulsed, amplitude-
modulated radio transmissions.
