
Spectral Imaging for Remote Sensing

Gary A. Shaw and Hsiao-hua K. Burke

Lincoln Laboratory Journal, Volume 14, Number 1, 2003

■ Spectral imaging for remote sensing of terrestrial features and objects arose as an alternative to high-spatial-resolution, large-aperture satellite imaging systems. Early applications of spectral imaging were oriented toward ground-cover classification, mineral exploration, and agricultural assessment, employing a small number of carefully chosen spectral bands spread across the visible and infrared regions of the electromagnetic spectrum. Improved versions of these early multispectral imaging sensors continue in use today. A new class of sensor, the hyperspectral imager, has also emerged, employing hundreds of contiguous bands to detect and identify a variety of natural and man-made materials. This overview article introduces the fundamental elements of spectral imaging and discusses the historical evolution of both the sensors and the target detection and classification applications.

Of the five senses, vision plays a central role in human perception and interpretation of the world. When we hear a loud crash, smell something burning, or feel something slipping out of our grasp, our first response is visual—we look for the source of the trouble so we can assess and respond to the situation. Our eyes and brain can quickly provide detailed information about whatever event is occurring around us, which leads to a choice of appropriate action or response. The importance of human visual perception is also apparent when we consider that vision processing consumes a disproportionately large part of human brain function. It is therefore not surprising that, historically, much of our success in remote sensing, whether for civilian, military, terrestrial, or extraterrestrial purposes, has relied upon the production of accurate imagery along with effective human interpretation and analysis of that imagery.

Electro-optical remote sensing involves the acquisition of information about an object or scene without coming into physical contact with that object or scene. Panchromatic (i.e., grayscale) and color (i.e., red, green, blue) imaging systems have dominated electro-optical sensing in the visible region of the electromagnetic spectrum. Longwave infrared (LWIR) imaging, which is akin to panchromatic imaging, relies on thermal emission of the objects in a scene, rather than reflected light, to create an image.

More recently, passive imaging has evolved to include not just one panchromatic band or three color bands covering the visible spectrum, but many bands—several hundred or more—encompassing the visible spectrum and the near-infrared (NIR) and shortwave infrared (SWIR) bands. This evolution in passive imaging is enabled by advances in focal-plane technology, and is aimed at exploiting the fact that the materials comprising the various objects in a scene reflect, scatter, absorb, and emit electromagnetic radiation in ways characteristic of their molecular composition and their macroscopic scale and shape. If the radiation arriving at the sensor is measured at many wavelengths, over a sufficiently broad spectral band, the resulting spectral signature, or simply spectrum, can be used to identify the materials in a scene and discriminate among different classes of material.

While measurements of the radiation at many wavelengths can provide more information about the materials in a scene, the resulting imagery does not lend itself to simple visual assessment. Sophisticated processing of the imagery is required to extract all of the relevant information contained in the multitude of spectral bands.

In this issue of the Lincoln Laboratory Journal we focus attention on spectral measurements in the solar-reflectance region extending from 0.4 to 2.5 µm, encompassing the visible, NIR, and SWIR bands. These three bands are collectively referred to as the VNIR/SWIR. The measurement, analysis, and interpretation of electro-optical spectra is known as spectroscopy. Combining spectroscopy with methods to acquire spectral information over large areas is known as imaging spectroscopy. Figure 1 illustrates the concept of imaging spectroscopy in the case of satellite remote sensing.

Fundamentals of Spectral Imaging

Throughout this special issue of the Journal we refer to the illumination conditions in a scene as well as the reflectance properties of materials and surfaces in that scene. Irradiance refers to the light energy per unit time (power) impinging on a surface, normalized by the surface area, and is typically specified in watts per square meter (W/m²). Reflectance is a unitless number between 0 and 1 that characterizes the fraction of incident light reflected by a surface. Reflectance may be further qualified by parameters such as the wavelength of reflected light, the angle of incidence, and the angle of reflection. Radiance is an important related concept that does not distinguish between the light illuminating a surface or the light reflected from a surface.

FIGURE 1. The concept of imaging spectroscopy. An airborne or spaceborne imaging sensor simultaneously samples multiple spectral wavebands over a large area in a ground-based scene. After appropriate processing, each pixel in the resulting image contains a sampled spectral measurement of reflectance, which can be interpreted to identify the materials present in the pixel. The graphs in the figure illustrate the spectral variation in reflectance for soil, water, and vegetation. A visual representation of the scene at varying wavelengths can be constructed from this spectral information.

Radiance is simply the irradiance normalized by the solid angle (in steradians) of the observation or the direction of propagation of the light, and is typically measured in W/m²/steradian. Normalizing the radiance by the wavelength of the light, which is typically specified in microns (µm), yields spectral radiance, with units of W/m²/µm/steradian.

Reflectance Spectrum

We are accustomed to using color as one of the ways we distinguish and identify materials and objects. The color and reflectivity of an object are typically important indications of the material composition of the object, since different materials absorb and reflect the impinging light in a wavelength-dependent fashion. To a first order, the reflected light, or spectral radiance L_s(λ), that we see or that a sensor records is the product of the impinging scene radiance L_i(λ) and the material reflectance spectrum ρ(λ), both of which vary as a function of wavelength λ:

L_s(λ) = ρ(λ) L_i(λ) .     (1)

If the illumination spectrum is known, then the material reflectance spectrum, or reflectivity, can in principle be recovered from the observed spectral radiance over those regions of the spectrum in which the illumination is nonzero. Since the reflectance spectrum is independent of the illumination, the reflectance spectrum provides the best opportunity to identify the materials in a scene by matching the scene reflectance spectra to a library of known spectra.
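As a minimal, purely illustrative sketch of this recovery, the following divides an observed spectral radiance by an assumed known illumination spectrum wherever the illumination is usefully nonzero. All arrays and the noise-floor threshold here are invented for illustration; real processing must also handle the atmospheric and sensor effects discussed next.

```python
import numpy as np

# Hypothetical wavelength grid (micrometers) and illumination spectrum L_i(lambda)
wavelength = np.linspace(0.4, 2.5, 211)
L_i = np.exp(-((wavelength - 0.5) ** 2) / 0.5)      # notional solar-like illumination
rho_true = 0.3 + 0.1 * np.sin(6 * wavelength)       # notional material reflectance

# Observed spectral radiance per Equation 1: L_s(lambda) = rho(lambda) * L_i(lambda)
L_s = rho_true * L_i

# Recover reflectance only where the illumination is above an assumed noise floor
eps = 1e-3
valid = L_i > eps
rho_est = np.full_like(L_s, np.nan)
rho_est[valid] = L_s[valid] / L_i[valid]

print(f"max reflectance error in valid bands: {np.nanmax(np.abs(rho_est - rho_true)):.2e}")
```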

In the case of solar illumination, the spectral irradiance of the light reaching the atmosphere is reasonably well characterized, as shown in Figure 2. Equation 1, however, oversimplifies the relation between reflectance and illumination. Many environmental and sensing phenomena can complicate the recovery of the reflectance spectra. For example, even though the spectrum of the solar radiation reaching the atmosphere is well characterized, the spectrum of the solar radiation reaching the ground is altered in a temporally and geographically dependent fashion because of propagation of solar radiation through the earth's constantly changing atmosphere. Such atmospheric modulation effects must be accounted for in order to reliably recover the reflectance spectra of materials on the ground in a sunlit scene.

Sensor errors can further impede the recovery of reflectance spectra by distorting and contaminating the raw imagery. For example, focal-plane vibration can result in cross-contamination of adjacent spectral bands, thus distorting the observed spectrum. A comprehensive discussion of sensor errors and artifacts is beyond the scope of this article. The next section, however, provides an overview of the image formation process.

Image Formation and Area Coverage Rate

Collecting two-dimensional spatial images over many narrow wavebands with a two-dimensional focal-plane imaging sensor typically involves some form of time-sequenced imaging. This collection can be accomplished by either a time sequence of two-dimensional spatial images at each waveband of interest, or a time sequence of spatial-spectral images (one-dimensional line images containing all the wavebands of interest), with multiple one-dimensional spatial images collected over time to obtain the second spatial dimension.

A variety of techniques have been developed to collect the required data. A common format for data collection is a push-broom imaging sensor, in which a cross-track line of spatial pixels is decomposed into K spectral bands.


FIGURE 2. Solar spectral irradiance curves at the top of the atmosphere and at ground level. The solar irradiance outside the atmosphere (black curve) is well characterized. At ground level (red curve) it is altered by the absorption and scattering effects of the atmosphere. The recovery of reflectance spectra of different objects and materials must take into account the effects the atmosphere has on the spectrum of both the solar illumination and the reflected light.


Figure 3 illustrates the geometry for this type of data-collection system. The spectral decomposition can be accomplished by using any of several mechanisms, such as a diffraction grating or a wedge filter [1].

Historically, spectral imaging for remote sensing can be traced back to the Television Infrared Observation Satellite (TIROS) series first launched in 1960 [2]. The TIROS legacy has continued with the Advanced Very High Resolution Radiometer (AVHRR) and the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments [3], launched aboard the Terra (Latin for earth) spacecraft in 1999. The spatial resolution of AVHRR is four kilometers, and the spatial resolution of the multispectral MODIS sensor varies from 250 meters in some spectral bands to one kilometer in others. In comparison, the focus of the articles in this issue of the Journal is on sensors with spatial resolutions better than thirty meters, and in some cases better than one meter.

Sampling

There are four sampling operations involved in the collection of spectral image data: spatial, spectral, radiometric, and temporal. In the systems discussed in this issue, we assume the spatial resolution is identical to the ground sample distance (GSD), although in general the GSD could be smaller than the spatial resolution. For the systems described in this issue, the GSD varies from a fraction of a meter to tens of meters, and is established primarily by the sensor aperture and platform altitude. Platform altitude is loosely constrained by the class of sensor platform (e.g., spaceborne versus airborne).

As noted earlier, spectral sampling is achieved by decomposing the radiance received in each spatial pixel into a number of wavebands. The wavebands may vary in resolution, and may be overlapping, contiguous, or disparate, depending upon the design of the sensor. A color image, consisting of red, green, and blue bands, is a familiar example of a spectral sampling in which the wavebands (spectral channels) are non-overlapping and relatively broad.

An analog-to-digital converter samples the radiance measured in each spectral channel, producing digital data at a prescribed radiometric resolution. The result is a three-dimensional spectral data cube, represented by the illustration in Figure 4. Figure 4(a) shows the spectra in cross-track scan lines being gathered by an airborne sensor. Figure 4(b) shows the scan lines being stacked to form a three-dimensional hyperspectral data cube with spatial information in the x and y dimensions and spectral information in the z dimension. In Figure 4(b) the reflectance spectra of the pixels comprising the top and right edge of the image are projected into the z dimension and color coded according to the amplitude of the spectra, with blue denoting the lowest-amplitude reflectance values and red denoting the highest values.
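The data cube described above maps directly onto a three-dimensional array. The sketch below, with purely illustrative sizes, stacks hypothetical push-broom scan lines (each n_y cross-track pixels by K bands) into a cube and pulls out the sampled spectrum of one pixel.

```python
import numpy as np

ny, K = 256, 220            # cross-track pixels and spectral bands (illustrative sizes)
n_lines = 100               # along-track scan lines accumulated as the platform moves

# Pretend the sensor delivers one (ny x K) spatial-spectral frame per scan line
scan_lines = [np.random.rand(ny, K).astype(np.float32) for _ in range(n_lines)]

# Stack scan lines along the along-track axis: cube shape is (along-track, cross-track, band)
cube = np.stack(scan_lines, axis=0)
assert cube.shape == (n_lines, ny, K)

# The sampled spectrum of a single ground pixel is a length-K vector
spectrum = cube[50, 128, :]
print(spectrum.shape)       # (220,)
```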

FIGURE 3. (a) Geometry of a push-broom hyperspectral imaging system. The area coverage rate is the swath width times the platform ground velocity v. The area of a pixel on the ground is the square of the ground sample distance (GSD). (b) An imaging spectrometer on the aircraft disperses light onto a two-dimensional array of detectors, with n_y elements in the cross-track (spatial) dimension and K elements in the spectral dimension, for a total of N = K × n_y detectors.



FIGURE 4. Structure of the hyperspectral data cube. (a) A push-broom sensor on an airborne or spaceborne platform collects spectral information for a one-dimensional row of cross-track pixels, called a scan line. (b) Successive scan lines comprised of the spectra for each row of cross-track pixels are stacked to obtain a three-dimensional hyperspectral data cube. In this illustration the spatial information of a scene is represented by the x and y dimensions of the cube, while the amplitude spectra of the pixels are projected into the z dimension. (c) The assembled three-dimensional hyperspectral data cube can be treated as a stack of two-dimensional spatial images, each corresponding to a particular narrow waveband. A hyperspectral data cube typically consists of hundreds of such stacked images. (d) Alternately, the spectral samples can be plotted for each pixel or for each class of material in the hyperspectral image. Distinguishing features in the spectra provide the primary mechanism for detection and classification of materials in a scene.


FIGURE 5. Atmospheric and scene-related factors that can contribute to degradations in the imaging process. The spatial resolution of the sensor and the degree of atmospheric scattering and absorption are the most significant contributors to diminished image quality.

The spectral information in the z dimension might be sampled regularly or irregularly, depending on the design of the sensor. Spectral image data can be viewed as a stack of two-dimensional spatial images, one image for each of the sampled wavebands, as shown in Figure 4(c), or the image data can be viewed as individual spectra for given pixels of interest, as shown in Figure 4(d). (Note: In Figure 4, color in parts c and d represents spectral bands rather than reflectance amplitudes.)

While there is an integration time τ_d associated with the image formation process, we use the term temporal sampling to refer not to the time associated with image formation, but to the process of collecting multiple spectral images of the same scene separated in time. Temporal sampling is an important mechanism for studying natural and anthropogenic changes in a scene [4, 5].

Practical Considerations

The concept of spectral imaging, as illustrated in Figure 1, appears straightforward. There are many practical issues, however, that must be addressed in the design and implementation of a spectral imaging system. These issues include the spatial and spectral resolution of the sensor, atmospheric effects such as absorption and scattering, the spectral variability of surface materials in the scene, and other environmental effects such as viewing angle, secondary illumination, and shadowing. Figure 5 illustrates a number of these salient issues for a representative scene.

Spatial Resolution

Sensor cost is usually a strong function of aperture size, particularly for spaceborne systems. Reducing the aperture size reduces sensor cost but results in degraded spatial resolution (i.e., a larger GSD). For a spectral imager, the best detection performance is expected when the angular resolution of the sensor, specified in terms of the GSD, is commensurate with the footprint of the targets of interest. Targets, however, come in many sizes. Consequently, for a given sensor design, some targets may be fully resolved spatially, while others may fill only a fraction of the GSD footprint that defines a pixel. Therefore, detection and identification algorithms must be designed to function well regardless of whether targets are fully resolved (i.e., fully fill a pixel) or comprise only a fraction of the material in a given pixel (i.e., subpixel).

Atmospheric Effects

The atmosphere absorbs and scatters light in a wavelength-dependent fashion. This absorption and scattering has several important implications that cause difficulties for sensor imaging. Four of these difficulties are described below.

First, the atmosphere modulates the spectrum of the solar illumination before it reaches the ground, and this modulation must be known or measured in order to separate the spectrum of the illumination (the impinging solar radiance) from the reflectivity (the reflectance spectrum) that characterizes the materials of interest in the scene. Figure 6 shows the one-way total atmospheric transmission curve for two different water vapor and aerosol conditions, corresponding to extremes of wet and dirty versus clean and dry atmospheres. Figure 6 also shows the contributions of well-mixed gases, aerosols, and water vapor to the overall transmission. As the solar-illumination angle and the viewing angle of the sensor change, the total path through the atmosphere changes, which in turn affects the total atmospheric transmission. Also, the water-vapor distribution (as well as the aerosol characteristics of the atmosphere) varies with location and time, so the methods of compensating for these effects must be scene based.

Second, some of the solar radiation is scattered by the atmosphere into the field of view of the sensor without ever reaching the ground. This scattered light is superimposed on the reflected light arriving from the scene, and is termed path radiance because it appears along the line-of-sight path to the scene. Third, the solar radiation scattered by the atmosphere, predominantly in the blue region of the visible spectrum, acts as a secondary source of diffuse colored illumination. This diffuse sky illumination is most important for shadowed objects, since regions shadowed from the direct rays of the sun may still be illuminated by the diffuse non-white sky radiation. Fourth, the solar illumination that reaches the scene and is reflected by the target is further absorbed and scattered by the atmosphere as it propagates toward the sensor.

A number of methods are currently used to estimate and compensate for these atmosphere propagation effects. The appendix entitled "Fundamentals of Atmospheric Compensation" provides additional detail on these compensation methods.
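One common way to summarize these effects in a single first-order model is the at-sensor radiance of a flat, Lambertian surface: path radiance plus the surface-reflected term attenuated on both the downward and upward paths. The sketch below uses that simplified model with entirely made-up transmission, path-radiance, and reflectance values; it is not one of the operational compensation codes referred to above, and it ignores diffuse-sky and adjacency terms.

```python
import numpy as np

# Hypothetical spectra on a coarse wavelength grid (all values illustrative)
E_toa  = np.array([1.8, 1.9, 1.5, 1.0, 0.4])      # solar irradiance at top of atmosphere
T_down = np.array([0.7, 0.8, 0.85, 0.9, 0.6])     # sun-to-ground transmission
T_up   = np.array([0.75, 0.85, 0.9, 0.92, 0.65])  # ground-to-sensor transmission
L_path = np.array([0.15, 0.10, 0.05, 0.02, 0.01]) # path radiance scattered into the view
rho    = np.array([0.05, 0.10, 0.25, 0.30, 0.20]) # surface reflectance
sun_zenith = np.deg2rad(30.0)

# Forward model (Lambertian surface, direct illumination only)
E_ground = E_toa * T_down * np.cos(sun_zenith)         # irradiance reaching the surface
L_sensor = L_path + rho * E_ground * T_up / np.pi      # radiance reaching the sensor

# Compensation inverts the same model to recover reflectance
rho_est = np.pi * (L_sensor - L_path) / (T_up * E_ground)
print(np.allclose(rho_est, rho))                       # True
```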

Other Environmental Effects

In addition to atmospheric absorption and scattering, several other prominent environmental parameters and associated phenomena have an influence on spectral imaging. The sun angle relative to zenith, the sensor viewing angle, and the surface orientation of the target all affect the amount of light reflected into the sensor field of view. Clouds and ground cover may cast shadows on the target, substantially changing the illumination of the surface. Nearby objects may also reflect or scatter sunlight onto the target, superimposing various colored illuminations upon the dominant direct solar illumination, as illustrated in Figure 5.

FIGURE 6. Effect of atmospheric gases, aerosols, and water vapor on total atmospheric transmission. The green curves represent the case of modest influence (0.4 cm col of water vapor), which corresponds to a rural environment with a visual range of 23 km. The red curves represent the case of strong influence (4.1 cm col of water vapor), which corresponds to an urban environment with a visual range of 5 km.



FIGURE 7. Example of variability in reflectance spectra measured over multiple instances of a given material (in this case, vehicle paint) in a scene. The shapes of the spectra are fairly consistent, but the amplitudes vary considerably over the scene. To exploit this spectral shape invariance, some detection algorithms give more weight to the spectral shape than to the spectral amplitude in determining whether a given material is present in a pixel. The gaps correspond to water-vapor absorption bands where the data are unreliable and are discarded.

Spectral Variability

Early in the development of spectral imaging, researchers hypothesized that the reflectance spectrum of every material is unique and, therefore, represents a means for uniquely identifying materials. The term "spectral signature," which is still in use today, suggests a unique correspondence between a material and its reflectance spectrum. In field data, however, as well as laboratory data, we observe variability in the reflectance spectrum of most materials. Many mechanisms may be responsible for the observed variability, including uncompensated errors in the sensor, uncompensated atmospheric and environmental effects, surface contaminants, variation in the material such as age-induced color fading due to oxidation or bleaching, and adjacency effects in which reflections from nearby objects in the scene change the apparent illumination of the material. Seasonal variations also introduce enormous changes in the spectral character of a scene. We need only observe the changes in a deciduous forest over spring, summer, fall, and winter to appreciate the degree of spectral variability that can arise in the natural materials comprising a scene.

There is some reason to expect that man-made materials exhibit less spectral variability than the naturally occurring materials in a scene. Figure 7 shows a number of instances of reflectance spectra derived for multiple occurrences of fully resolved vehicle-paint pixels in a scene. Note that while the shapes of the spectra are fairly consistent, the amplitude varies considerably over the scene. In an effort to exploit the spectral shape invariance, some of the more successful detection algorithms give more weight to the spectral shape than to the amplitude when determining whether a particular material is present in a pixel.
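One widely used way to weight shape over amplitude is to score a pixel by the angle between its spectrum and a reference spectrum, since the angle is unchanged when a spectrum is scaled by a positive constant. The sketch below is a minimal spectral-angle matcher with invented spectra; it is not the specific algorithm applied to the data in Figure 7.

```python
import numpy as np

def spectral_angle(x, r):
    """Angle (radians) between a pixel spectrum x and a reference spectrum r.
    Scaling x by any positive constant leaves the angle unchanged."""
    cos_theta = np.dot(x, r) / (np.linalg.norm(x) * np.linalg.norm(r))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

reference = np.array([0.12, 0.18, 0.35, 0.30, 0.22])   # hypothetical paint spectrum
bright    = 1.8 * reference                             # same shape, higher amplitude
grass     = np.array([0.05, 0.08, 0.40, 0.45, 0.25])   # hypothetical different material

print(spectral_angle(bright, reference))   # ~0: same shape despite amplitude difference
print(spectral_angle(grass, reference))    # larger angle: different spectral shape
```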

Motion and Sensor Artifacts

As indicated in Figure 3, spectral imaging sensors typically exploit sensor platform motion as a means of scanning a scene. However, nonlinear motion of the sensor can corrupt the spectral image by mixing together the spectral returns from different parts of the spatial image. Motion of objects in the scene can also create artifacts if the motion is on the order of the GSD during the integration time τ_d. This type of spatial migration can degrade image quality, but fortunately it is not as significant as the spectral migration in terms of impact on subsequent processing for detection and identification of materials. The imaging of a scene, including detection and conversion of the radiance to a digital sequence, introduces a number of artifacts such as thermal noise, quantization, and geometric distortion. The nature of these sensor artifacts, and methods for modeling them, are discussed in more detail in the article entitled "Hyperspectral Imaging System Modeling," by John P. Kerekes and Jerrold E. Baum.


Spatial versus Spectral Information

The emergence of radar sensors in the late 1930s opened the door to detection and discrimination of radar-reflecting targets without relying on traditional spatial imagery. Less than sixteen years after the introduction of the first operational radar, however, researchers were flying experimental side-looking airborne radars (SLAR) in order to transform radar returns into images [6]. Synthetic aperture radar (SAR) was devised to create images from two-dimensional range-Doppler radar reflectance maps [7]. Interferometric SAR was developed later as a means of generating three-dimensional radar images [8]. The introduction of light detection and ranging (lidar) technology provided another way to create three-dimensional images in the form of high-resolution angle-angle-range reflectance maps. Throughout the evolution of passive and active imaging methods, improved spatial resolution has been a mantra of development that has pushed optical sensors to larger apertures and radar sensors to higher frequencies. Automated detection and recognition algorithms have evolved along with these sensors, but human visual analysis continues to be the predominant means of interpreting the imagery from operational imaging sensors.

Multispectral Imaging

Increasing the spatial resolution of imaging sensors is an expensive proposition, both in terms of the sensor and in terms of the amount of sensor data that must subsequently be processed and interpreted. The trained human analyst has traditionally carried the main burden of effective image interpretation, but as technology continues to advance, the volume of high-resolution optical and radar imagery has grown at a faster rate than the number of trained analysts.

In the 1960s, the remote sensing community, recognizing the staggering cost of putting large passive imaging apertures in space, embraced the concept of exploiting spectral rather than spatial features to identify and classify land cover. This concept relies primarily on spectral signature rather than spatial shape to detect and discriminate among different materials in a scene. Experiments with line-scanning sensors providing up to twenty spectral bands in the visible and infrared were undertaken to prove the concepts [9]. This experimental work helped create interest and support for the deployment of the first space-based multispectral imager, Landsat-1 [10], which was launched in 1972 (Landsat-1 was originally designated the Earth Resource Technology Satellite, or ERTS 1). Landsat-7, the latest in this series of highly successful satellites, was launched in 1999. A Lincoln Laboratory prototype for a multispectral Advanced Land Imager (ALI) was launched in November 2000 aboard the National Aeronautics and Space Administration (NASA) Earth Observing (EO-1) satellite [11]. EO-1 also carries a VNIR/SWIR hyperspectral sensor, called Hyperion, with 220 spectral bands and a GSD of thirty meters.

Hyperspectral Imaging

On the basis of the success of multispectral sensing, and enabled by advances in focal-plane technology, researchers developed hyperspectral sensors to sample the expanded reflective portion of the electromagnetic spectrum, which extends from the visible region (0.4 to 0.7 µm) through the SWIR (about 2.5 µm), in hundreds of narrow contiguous bands about ten nanometers wide. The majority of hyperspectral sensors operate over the VNIR/SWIR bands, exploiting solar illumination to detect and identify materials on the basis of their reflectance spectra.

If we consider the product of spatial pixels times spectral bands (essentially the number of three-dimensional resolution cells in a hypercube) to be a measure of sensor complexity, then we can preserve the overall complexity in going from a panchromatic sensor (with one broad spectral band) to a hyperspectral sensor (with several hundred narrow spectral bands) by reducing the number of spatial pixels by a factor of several hundred while keeping the field of view constant. In effect, a one-dimensional reduction in spatial resolution by a factor of approximately fifteen (i.e., the square root of K, the number of spectral bands, which is typically about 220) compensates for the increased number of spectral samples and reduces the required aperture diameter by the same factor, thus reducing the potential cost of a sensor. However, while the total number of three-dimensional resolution cells is preserved, the information content of the image is generally not preserved when making such trades in spatial versus spectral resolution.

Area Coverage Rate

In many cases, a primary motivation for applying spectral imaging to remote sensing is to reduce the required spatial resolution—and thus the size and weight—of the sensor. Therefore, it is worthwhile, even in this brief overview, to consider the design trade-offs involved in selecting the spatial and spectral resolution of a spectral imaging sensor. Table 1 summarizes the important spectral image formation parameters in this discussion.

When the achievable signal-to-noise ratio (SNR) is limited by the imaging process in the sensor and not by noise (interference) in the scene, the SNR² of the sensor grows in proportion to the product of the receive aperture area d², the area of a pixel on the ground (which for the purpose of this simple analysis we equate to the square of the GSD), the time interval τ_d over which the signal is integrated at the detector, and the scene radiance L_s at the sensor:

SNR² ∝ d² × GSD² × τ_d × L_s .     (2)

The integration time is proportional to the GSD divided by the platform velocity v:

τ_d ∝ GSD / v .

In a spectral imaging sensor, the radiance signal from each pixel is decomposed into K spectral bands, so that the average spectral radiance L_λ in each spectral band (detector) on the focal plane is

L_λ ∝ L_s / K .

Substituting for τ_d in Equation 2, and absorbing the scene-radiance scale factor into the proportionality sign, the SNR proportionality at the detector can be expressed as

SNR² ∝ (d² × GSD³) / (v × K) .     (3)

To relate this expression to area coverage rate (ACR), we note from Figure 3 that the swath width (SW) is equal to the GSD times the number of cross-track samples:

SW = GSD × (N / K) ,

where N is the total number of detectors in the focal plane. Therefore, Equation 3 can be rewritten as

SNR² ∝ (d² × GSD⁴ × N) / (v × SW × K²) .

The achievable ACR is then given by

ACR ≡ SW × v ∝ (d² × GSD⁴ × N) / (SNR² × K²) .

By fixing the aperture size d and the number of detectors N on the focal plane, we find that the ACR is proportional to the square of the ground sample area, and inversely proportional to both the square of the SNR and the square of the number of spectral bands:

ACR ∝ GSD⁴ / (SNR² × K²) .     (4)
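Equation 4 can be wrapped in a small helper that scales a baseline design to K bands under a chosen GSD/SNR trade. The sketch below is illustrative only; it assumes the panchromatic baseline used in the trade-space example that follows (GSD = 0.3 m, SNR = 100, ACR = 30 km²/hr) and evaluates the trade in which the ACR is held constant.

```python
def acr_relative(K, gsd_exp, snr_exp):
    """Relative ACR from Equation 4 when GSD scales as K**gsd_exp and the
    per-band SNR scales as K**(-snr_exp), with aperture and detector count fixed."""
    gsd_factor = K ** gsd_exp
    snr_factor = K ** (-snr_exp)
    return gsd_factor ** 4 / (snr_factor ** 2 * K ** 2)

# Baseline panchromatic design (values used in the trade-space example below)
GSD0, SNR0, ACR0 = 0.3, 100.0, 30.0

for K in (10, 100):
    # Constant-ACR trade: GSD grows as K**(1/4), per-band SNR falls as K**(1/2)
    rel = acr_relative(K, gsd_exp=0.25, snr_exp=0.5)
    print(K, round(GSD0 * K ** 0.25, 2), round(SNR0 * K ** -0.5), round(ACR0 * rel, 1))
    # K=10  -> GSD ~0.53 m, SNR ~32, ACR 30 km^2/hr
    # K=100 -> GSD ~0.95 m, SNR  10, ACR 30 km^2/hr
```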

Spatial versus Spectral Trade-Space Example

The expression for ACR given in Equation 4 can be evaluated to further quantify the trade-off between spectral and spatial resolution. Consider, as a baseline for comparison, a panchromatic imaging system (K = 1), with GSD = 0.3 m, SNR = 100, and ACR = 30 km²/hr.

Table 1. Spectral Image Formation Parameters

Aperture diameter: d
Image swath width: SW
Integration time: τ_d
Number of cross-track elements: n_y
Number of spectral channels: K
Number of detectors in focal plane: N = n_y × K
Scene radiance: L_s
Sensor platform velocity: v
Ground sample distance: GSD


In the following comparison we consider a 10-band system to be representative of a multispectral sensor, and a 100-band system to be representative of a hyperspectral system.

According to Equation 4, increasing the number of spectral bands while maintaining the GSD and detector SNR constant results in a decrease in ACR. Since a primary rationale for multispectral and hyperspectral imaging is to reduce reliance on high spatial resolution for detection and identification, the ACR can be preserved while increasing the number of spectral bands by increasing the GSD (i.e., reducing the spatial resolution). According to Equation 4, one way to preserve the ACR when increasing the spectral bands by K is to increase the GSD by a factor of K^{1/2}. However, since each pixel is decomposed into K bands, detection processing such as matched filtering, which combines the K bands for detection, can yield an improvement in SNR of K^{1/2}, which is the gain from noncoherent integration of K samples. Therefore, it makes sense to reduce the SNR in each detector by K^{1/2} and increase the GSD by a smaller amount. An increase in GSD by K^{1/4} and a decrease in SNR by K^{1/2} preserves the overall ACR. A decrease in the pixel SNR by K^{1/2} along with an increase in GSD by K^{1/2} increases the ACR by a factor of K, relative to the baseline panchromatic sensor. Figure 8 shows these spectral-versus-spatial trades and the impact on ACR.

In Figure 8 the performance of the baseline panchromatic sensor (one spectral band) is represented by the convergence of the three curves at the normalized ACR of 1.0. As more spectral bands are added, the ACR can be preserved by letting the GSD grow in proportion to K^{1/4} while the detector SNR is allowed to decrease in proportion to 1/K^{1/2}, as represented by the red curve.

FIGURE 8. Example of trade space for spatial, spectral, and area coverage rate (ACR). With the aperture diameter, number of detectors, and platform velocity fixed, ACR ∝ GSD⁴ / (SNR² × K²). The plot shows the improvement in ACR that can be achieved with a multiband spectral sensor by trading spatial resolution for spectral resolution. The baseline reference system is a single-band panchromatic imager with a resolution of 0.3 m and an ACR of 30 km²/hr. The red curve represents the family of multiband imagers that achieve the same ACR as the baseline by simultaneously reducing the spatial resolution (i.e., increasing GSD) and the SNR in an individual detector. The blue and green curves correspond to other trades in which the ACR of the multiband sensor is increased relative to the panchromatic baseline. The accompanying table provides specific values of GSD, SNR, and ACR for a single-band panchromatic system, a 10-band multispectral system, and a 100-band hyperspectral system under each trade; the SNR values represent the average SNR per spectral band.

GSD ∝ K^{1/4}, SNR ∝ 1/K^{1/2} (red curve): 1 band: 0.3 m, SNR 100, 30 km²/hr; 10 bands: 0.5 m, SNR 32, 30 km²/hr; 100 bands: 1 m, SNR 10, 30 km²/hr.
GSD ∝ K^{1/2}, SNR ∝ 1/K^{1/4} (blue curve): 1 band: 0.3 m, SNR 100, 30 km²/hr; 10 bands: 1 m, SNR 56, 100 km²/hr; 100 bands: 3 m, SNR 32, 300 km²/hr.
GSD ∝ K^{1/2}, SNR ∝ 1/K^{1/2} (green curve): 1 band: 0.3 m, SNR 100, 30 km²/hr; 10 bands: 1 m, SNR 32, 300 km²/hr; 100 bands: 3 m, SNR 10, 3000 km²/hr.


The companion table shows examples of the resulting GSD and SNR for the cases when K = 10 and K = 100, corresponding to representative multispectral and hyperspectral sensors. The blue curve represents a design trade in which the ACR is increased by a factor of K^{1/2} over the panchromatic baseline. Note from the table that the factor of K^{1/2} = 10 improvement in ACR for K = 100 is achieved at the cost of an SNR reduction per detector of K^{1/4} = 3.2. Matched filtering of the spectral information in a pixel may compensate for much of the reduction in SNR within the individual detectors.
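The K^{1/2} integration gain invoked above can be checked numerically: averaging K independent noisy band measurements of the same quantity reduces the noise standard deviation by K^{1/2}. The sketch below is a Monte Carlo check of that scaling with arbitrary numbers, not a model of any particular detector or matched filter.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 100                      # number of spectral bands combined
signal = 1.0
sigma_band = 0.5             # per-band noise standard deviation (per-band SNR = 2)

# Many trials of averaging K independent noisy band measurements of the same signal
trials = rng.normal(signal, sigma_band, size=(20000, K)).mean(axis=1)

snr_band = signal / sigma_band
snr_combined = signal / trials.std()
print(snr_band, round(snr_combined, 1), round(snr_band * K ** 0.5, 1))
# per-band SNR 2.0; combined SNR ~20, i.e., improved by about K**0.5 = 10
```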

High-Dimensional Data Representation

The manner in which spectral image data are processed is strongly influenced by the high dimensionality of the data. To illustrate the concept of data dimensionality, we consider a multispectral image sensor with only three wavebands, λ1, λ2, and λ3. As in the high-dimensional case, the wavelength-dependent reflectivity of a material is used to characterize the material. For this low-dimensional example, if a number of different materials are measured, each material can be represented as a data point in a three-dimensional space, as indicated by the blue dots in Figure 9. Each unique point in this space represents a spectral signature defined by the 3-tuple (λ1, λ2, λ3). The spectral signature is thus seen as a vector of length—or dimension—three. This representation is highly compatible with many numerical processing algorithms.

Dimensionality Reduction

Depending upon the nature of the data, it may be possible to project the data cloud into a lower dimensional space while still preserving the separation and uniqueness of the data points. Figure 9 illustrates this process, in which the three-dimensional representation (indicated by blue dots) is projected onto a two-dimensional plane (indicated by red dots), thus reducing the dimensionality by one. If two or more of the blue dots project onto the same point in the two-dimensional plane, it is no longer possible to distinguish the projected blue points as unique. In general, finding a projection that preserves the relevant information content of the data (separation) while reducing dimensionality is a complicated undertaking. The article in this issue by Nirmal Keshava entitled "A Survey of Spectral Unmixing Algorithms" describes this topic in greater detail.
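The three-band example can be made concrete in a few lines: dropping the λ3 coordinate projects each signature onto the (λ1, λ2) plane, and a quick uniqueness check shows whether any materials collide after the projection. The signatures below are invented for illustration.

```python
import numpy as np

# Hypothetical 3-band reflectance signatures (lambda1, lambda2, lambda3) for four materials
signatures = np.array([
    [0.10, 0.40, 0.22],
    [0.10, 0.40, 0.35],   # same first two bands as the material above
    [0.30, 0.15, 0.50],
    [0.05, 0.60, 0.10],
])

projected = signatures[:, :2]                 # project onto the (lambda1, lambda2) plane
unique_rows = np.unique(projected, axis=0)
print(len(unique_rows) == len(projected))     # False: two materials collide after projection
```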

The same concept of dimensionality illustrated in Figure 9 can be applied to hyperspectral sensors with hundreds of contiguous spectral wavebands. Unlike the three-dimensional example, all the information contained in a high-dimensional data cloud cannot be visualized by a two-dimensional or three-dimensional perspective plot. At best, we can construct two-dimensional and three-dimensional visualizations by projecting the K-dimensional spectral vector into an appropriately chosen two-dimensional or three-dimensional subspace.

An important implication of the dimension-reduction example is that, unlike conventional panchromatic and color imagery, higher-dimension hyperspectral imagery requires computer processing for maximum information extraction and visualization.


FIGURE 9. Example of data from a multispectral image sensor with only three wavebands, λ1, λ2, and λ3. In this three-dimensional example, the wavelength-dependent reflectivity of a material is measured in each of the three wavebands and used to characterize the material. If a number of different materials are measured, each material can be represented as a data point in a three-dimensional space, as indicated by the blue dots. Dimensionality reduction of the data is illustrated by projecting the data cloud into a lower dimensional space defined by the (λ1, λ2) plane. The projected data, represented by red dots, still preserve the uniqueness of the original data points, although the separation in Euclidean distance between the data points is reduced.


This processing can be totally automated or it can be accomplished with guidance from an image analyst. After processing is completed, the information can be highlighted, perhaps with pseudocolor, and displayed in a two-dimensional image, but the computer-processing step is an essential precursor to visualization.

How Many Dimensions Are Needed?

If the task is to discriminate between a particular target material and a particular class of background, a few well-chosen wavebands are usually sufficient to separate the target and background materials. If this is true, we might well question the motivation for hyperspectral sensing, in which hundreds of contiguous narrow wavebands are measured by the sensor. Different materials, however, exhibit different spectral features. Certain paints and vegetation can be characterized by broad, slowly varying spectral features. Other materials, such as minerals and gases, possess very narrow spectral features, and the location of these narrow features in the spectral band differs for each class of material. Therefore, narrow wavebands may be needed to resolve features that help differentiate similar spectra, and contiguous bands are needed to handle the expected variety of materials, since important features may be in different spectral locations for each material. In addition, the narrow wavebands that straddle the water-vapor absorption bands are important in estimating and correcting for the variable water vapor contained in the atmosphere.

Thus, if we were interested in only a few target materials and backgrounds, a limited number of carefully chosen narrow wavebands would suffice to recover the salient spectral features. As more types of targets and backgrounds are added to the list, however, the number and location of wavebands needed to discriminate between any given spectral pair grows rapidly. One possible solution would be to build many special-purpose sensors, each of which collects only a minimal set of wavebands needed for a limited set of targets and backgrounds. A more cost-effective and robust solution is to build one type of sensor that oversamples the spectral information, and to develop application-specific algorithms, or a family of algorithms, that remove the redundant or undesired spectral information while preserving the information relevant to a given application. The essence of hyperspectral processing for detection and identification of materials is the extraction of salient features and the suppression of redundant or common features.

Spectral Imaging Applications

By oversampling the spectral information, we can apply hyperspectral imaging sensors to a variety of distinctly different problems, adapting the processing used to extract relevant information. There is much value, however, in tailoring the spatial resolution and sensor field of view for the intended class of application. Higher spatial resolution provides an improved signal-to-background-interference ratio for spatially small targets, usually at the cost of decreased sensor field of view. In general, the spatial resolution of the sensor is chosen on the basis of the spatial extent of the primary targets of interest. Figure 10 presents a simplified taxonomy of the many different types of hyperspectral imaging applications, identifying three major categories: anomaly detection, target recognition, and background characterization.

Anomaly Detection

Anomaly detection is characterized by the desire to locate and identify uncommon features in an image. These features could be man-made materials dispersed in a natural background, or they could be departures of the natural features from the norm. One of the early applications of multispectral imaging was to detect the spread of corn blight in the Midwest [10]. This application depends primarily on being able to distinguish brown withered leaves interspersed in fields of healthy green corn stalks.

Target Recognition

Target recognition is distinguished from anomaly detection by the availability of some a priori information about the target. This information could be a library of spectral reflectance signatures associated with targets, or less explicit information such as a reflectance signature extracted from within the scene. For example, anomaly detection could be used to isolate specific materials in a scene, and subsequent passes of an imaging sensor might be used to search for similar materials as well as to verify that the previously detected objects or materials have not moved. With a spectral library, materials might be identified and associated with specific types of targets, thus providing a form of target recognition.

Background Characterization

The first two application categories listed in Figure 10 emphasize detection and identification of spatially isolated features in an image, in the form of anomalies or targets. The third category emphasizes overall background scene analysis and identification, and spans the domains of land, ocean, and atmosphere. Because the Landsat imaging program has operated for over thirty years, much of the spectral image data collected to date has been oriented toward land characterization, employing multispectral sensors with GSD values of thirty meters or more.

An example of background scene characterization is coastal characterization, including shallow-water bathymetry. The sidebar entitled "Hyperspectral versus Multispectral Remote Sensing for Coastal Characterization" illustrates the evolution of multispectral sensors for this application and shows examples of the information products that can be derived from multispectral and hyperspectral data.

Spectral Processing

As implied in the previous section, the number and variety of applications for hyperspectral remote sensing are potentially quite large. However, the majority of algorithms used in these applications can be organized according to the following primitive application-specific tasks: (a) searching the pixels of a hyperspectral data cube for rare spectral signatures (anomaly detection or target detection); (b) finding the significant (i.e., important to the user) changes between two hyperspectral scenes of the same geographic region (change detection); (c) assigning a label or class to each pixel of a hyperspectral data cube (classification); and (d) estimating the fraction of the pixel area occupied by each material present in the pixel (unmixing). Note that from a signal processing perspective, task c (labeling pixels) is a classification problem, whereas task d (determining the constituent elements of a pixel) is an estimation problem.
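Task (d) is often posed as a linear mixing problem: a pixel spectrum is modeled as a nonnegative combination of endmember spectra, and the combination weights estimate the material fractions. The sketch below solves that model with scipy's nonnegative least squares on invented endmembers; practical unmixing algorithms, including those surveyed in this issue, add further constraints and endmember-selection steps.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical endmember spectra (one column per material) over five bands
endmembers = np.array([
    [0.10, 0.40, 0.05],
    [0.15, 0.45, 0.06],
    [0.30, 0.35, 0.08],
    [0.35, 0.20, 0.10],
    [0.25, 0.15, 0.12],
])

true_fractions = np.array([0.6, 0.3, 0.1])        # fractions of pixel area per material
pixel = endmembers @ true_fractions               # linear mixing model for the pixel

fractions, residual = nnls(endmembers, pixel)     # nonnegative abundance estimates
fractions /= fractions.sum()                      # renormalize so the fractions sum to one
print(np.round(fractions, 3))                     # approximately [0.6, 0.3, 0.1]
```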

Preliminary Processing to Reduce Dimensionality

Raw data from the sensor must usually undergo a series of calibration and correction operations to compensate for artifacts and gain variations in the sensor. The result is a calibrated radiance cube, which may be processed directly, or it may have atmospheric compensation applied to it to produce a reflectance cube.


FIGURE 10. Simplified taxonomy of applications for hyperspectral imaging. The three major application categories are anomaly detection, target recognition, and background characterization. Anomaly detection divides pixels into man-made objects or natural features, target recognition provides detection parameters along with classification parameters of potential targets, and background characterization identifies the condition of natural features associated with land, ocean, or atmosphere.


Once a calibrated data cube is produced, dimensionality reduction of the data prior to application of detection or classification algorithms can lead to significant reductions in overall computational complexity. Reducing data dimensionality also reduces the number of pixels required to obtain accurate estimates of statistical parameters, since the number of samples (pixels) required to obtain a statistical estimate with a given accuracy is generally proportional to some power of the data dimensionality. The most widely used algorithm for dimensionality reduction is principal-component analysis (PCA), which is the discrete analog to the Karhunen-Loève transformation for continuous signals. The PCA algorithm is described in more detail in the article in this issue entitled "Enhancing Hyperspectral Imaging System Performance with Sensor Fusion," by Su May Hsu and Hsiao-hua K. Burke.
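A minimal version of the principal-component reduction mentioned above: form the band-by-band covariance of the pixel spectra, keep the leading eigenvectors, and project every spectrum onto them. The cube size and the choice of ten retained components below are arbitrary; the companion article cited above discusses how many components to retain in practice.

```python
import numpy as np

def pca_reduce(cube, n_components):
    """Project a (rows, cols, K) reflectance cube onto its leading principal components."""
    rows, cols, K = cube.shape
    X = cube.reshape(-1, K).astype(np.float64)        # one spectrum per row
    X -= X.mean(axis=0)                               # remove the mean spectrum
    cov = np.cov(X, rowvar=False)                      # K x K band covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)             # eigenvalues in ascending order
    components = eigvecs[:, ::-1][:, :n_components]    # leading eigenvectors
    scores = X @ components                             # reduced-dimension pixel data
    return scores.reshape(rows, cols, n_components), components

cube = np.random.rand(100, 100, 220)                   # hypothetical reflectance cube
reduced, basis = pca_reduce(cube, n_components=10)
print(reduced.shape)                                   # (100, 100, 10)
```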

Figure 11 is a simplified block diagram of the spectral processing chain, starting with a calibrated radiance cube, and illustrating the common elements of atmospheric compensation and dimensionality reduction. Subsequent specialized processing depends on the intended application. Figure 11 illustrates two of many possible applications, unmixing and detection, each of which is discussed briefly in the following sections and developed more fully in companion articles in this issue.

Classification versus Detection

Formally, classification is the process of assigning a label to an observation (usually a vector of numerical values), whereas detection is the process of identifying the existence or occurrence of a condition. In this sense, detection can be considered as a two-class classification problem: target exists or target does not exist. Traditional classifiers assign one and only one label to each pixel, producing what is known as a thematic map. This process is called hard classification. The need to deal more effectively with pixels containing a mixture of different materials, however, leads to the concept of soft classification of pixels. A soft classifier can assign to each pixel multiple labels, with each label accompanied by a number that can be interpreted as the likelihood of that label being correct or, more generally, as the proportion of the material within the pixel.

In terms of data products, the goal of target-detection algorithms is to generate target maps at a con-


FIGURE 11. Simplified spectral processing diagram. The spectral processing typically starts with calibration of the data cube, followed by atmospheric compensation and dimensionality reduction. Subsequent specialized processing is determined by the intended end-user application, such as unmixing or detection.


FIGURE A. Spaceborne multispectral ocean sensors from 1976 to the present,showing the numbers and locations of spectral bands. The variety of bandsindicates the ocean characteristics the sensors are designed to investigate.(Figure courtesy of the International Ocean Colour Coordinating Group, takenfrom IOCCG Report no. 1, 1998.)

HYPERSPECTRAL VERSUS MULTISPECTRAL REMOTE SENSING FOR COASTAL CHARACTERIZATION

[Figure A: spectral band positions, 350–950 nm, for CZCS, OCTS, POLDER, MOS, SeaWiFS, MODIS, MERIS, OCM, OCI, OSMI, GLI, POLDER-2, S-GLI, and MISR]

The ocean covers more than two-thirds of the earth’s surface. The optical properties of ocean water are influenced by many factors, such as phytoplankton, suspended material, and organic substances. Spectral remote sensing provides a means of routinely monitoring this part of our planet, and obtaining information on the status of the ocean.

Ocean waters are generally partitioned into open ocean and coastal waters. Phytoplankton is the principal agent responsible for variations in the optical properties of open ocean. Coastal waters, on the other hand, are influenced not just by phytoplankton and related particles, but also by other substances, most notably inorganic particles in suspension and organic dissolved materials. Coastal characterization is thus a far more complex problem for optical remote sensing.

Figure A illustrates a variety of multispectral ocean color sensors used in the past twenty-five years. The spectral bands of these sensors, which lie in the spectral range between 0.4 and 1.0 µm, were chosen to exploit the reflection, backscatter, absorption, and fluorescence effects of various ocean constituents. These multispectral sensors vary in the number of bands and exact band locations.


FIGURE B. Distribution of suspended matter (top), chlorophyll concentration (middle), and absorption by colored dissolved organic matter in the North Sea (bottom). These coastal ocean products are representative examples of multispectral remote sensing. With hyperspectral sensor data, these products can be retrieved more accurately because of better atmospheric correction and decoupling of complex phenomena. (Figure courtesy of the International Ocean Colour Coordinating Group, taken from IOCCG Report no. 3, 2000.)

The variety of bands and bandwidths in the sensors of Figure A indicates the lack of consensus on what the “best” bands are. Even though these multispectral sensors have routinely provided products for analysis of materials in the open ocean, their success with coastal water characterization has been limited because of the limited numbers of spectral bands. Complex coupled effects in coastal waters between the atmosphere, the water column, and the coastal bottom are better resolved with hyperspectral imaging. Physics-based techniques and automated feature-extraction approaches associated with hyperspectral sensor data give more information to characterize these complex phenomena.

Figure B illustrates some of the sample products over coastal regions. Hyperspectral sensors covering the spectral range between 0.4 and 1 µm include the necessary bands to compare with legacy multispectral sensors. Hyperspectral sensors can also gather new information not available from the limited numbers of bands in legacy systems.

Furthermore, even though most ocean-characterization algorithms utilize water-leaving radiance, the atmospheric aerosol effect is most pronounced in shortwave visible regions where ocean color measurements are made. With contiguous spectral coverage, atmospheric compensation can be done with more accuracy and precision.

[Figure B scales: suspended particulate material, 5–45 mg/l; chlorophyll, 2.5–10.0 µg/l; Gelbstoff absorption, 0.3 to >2.0 m⁻¹]


false-alarm rate (CFAR), which is a highly desirable feature of these algorithms. Change-detection algorithms produce a map of significant scene changes that, for reliable operation, depend upon the existence of a reliable CFAR change-detection threshold. The hard or soft thematic maps produced by these CFAR algorithms convey information about the materials in a scene, and this information can then be used more effectively for target or change detection.

The thematic-map approach is not feasible for target detection applications because of the lack of training data for the target. At first glance, detection and classification look deceptively similar, if not identical. However, some fundamental theoretical and practical differences arise because of the rarity of the target class, the desired final product (target detection maps versus thematic maps), and the different cost functions (misclassifying pixels in a thematic map is not as critical as missing a target or overloading a target tracking algorithm with false alarms). CFAR detection algorithms are discussed more fully in the article entitled “Hyperspectral Image Processing for Automatic Target Detection Applications,” by Dimitris Manolakis, David Marden, and Gary A. Shaw.
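A minimal sketch of the CFAR idea, assuming synthetic Gaussian background statistics and an invented target signature: the detection threshold for a spectral matched filter is chosen from the empirical background score distribution so that a desired false-alarm rate is maintained.

```python
import numpy as np

# Sketch of CFAR-style thresholding of a spectral matched filter. The target
# and background spectra are synthetic; a desired false-alarm rate fixes the
# detection threshold from the empirical background score distribution.
rng = np.random.default_rng(1)
K = 50                                     # number of bands (illustrative)
background = rng.normal(size=(5000, K))    # background pixel spectra (mean removed)
target = np.ones(K)                        # assumed target signature

cov = np.cov(background, rowvar=False)
w = np.linalg.solve(cov, target)           # matched-filter weights, C^-1 s
scores = background @ w                    # detection statistic for each pixel

pfa = 1e-3                                 # desired false-alarm rate
threshold = np.quantile(scores, 1.0 - pfa) # empirical CFAR threshold
detections = scores > threshold
print(f"threshold = {threshold:.2f}, false alarms = {detections.sum()} of {len(scores)}")
```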

Unmixing

As noted in the discussion of spatial resolution, since materials of interest (i.e., targets) may not be fully resolved in a pixel, there is value in being able to decompose the spectral signature from each pixel in a scene into the individual collection of material spectra comprising each pixel. The development of algorithms to extract the constituent spectra comprising a pixel, a process called unmixing, has been aggressively pursued only during the last decade. In contrast to detection and classification, unmixing is an estimation problem. Hence it is a more involved process and extracts more information from the data. The article in this issue entitled “A Survey of Spectral Unmixing Algorithms,” by Nirmal Keshava, discusses in greater depth the issues associated with unmixing.
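As a rough illustration of unmixing as an estimation problem, the sketch below solves a linear mixing model for a single synthetic pixel with non-negative least squares; the endmember spectra and abundances are invented, and real unmixing algorithms (surveyed in the Keshava article) are considerably more sophisticated.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch of linear spectral unmixing for a single mixed pixel. The endmember
# spectra and true abundances are synthetic; a real application would use
# library or scene-derived endmembers.
rng = np.random.default_rng(2)
K, M = 100, 3                                  # K bands, M endmembers
endmembers = np.abs(rng.normal(size=(K, M)))   # columns are endmember spectra
true_abund = np.array([0.6, 0.3, 0.1])
pixel = endmembers @ true_abund + 0.01 * rng.normal(size=K)   # mixed-pixel spectrum

# Non-negative least squares enforces physically meaningful (>= 0) abundances;
# the estimate is then renormalized to approximate the sum-to-one constraint.
abund, _ = nnls(endmembers, pixel)
abund /= abund.sum()
print("estimated abundances:", np.round(abund, 3))
```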

Scene Illumination

The overall quality of passive imagery, and the successin extracting and identifying spectral signatures frommultispectral or hyperspectral imagery, both depend

heavily on the scene illumination conditions. Asnoted previously, illumination within a scene can varysignificantly because of shadows, atmospheric effects,and sun angle. In fact, at many latitudes, the lack ofadequate solar illumination prohibits reliable spectralreflectance imaging for a large portion of a twenty-four-hour day. An obvious solution to this problem isto provide active controlled illumination of the sceneof interest. Achieving this simple goal, however, re-quires much more sophistication than we might firstimagine.

During peak months, solar illumination reaching the earth’s surface can be 800 W/m² or more, concentrated in the visible region of the spectrum. With the NASA Jet Propulsion Laboratory (JPL) Airborne Visible Infrared Imaging Spectrometer (AVIRIS) hyperspectral sensor [12] as an example, to artificially create this level of continuous illumination over the 17-m × 11-km swath comprising an AVIRIS scan line would require on the order of 150 MW of optical power, which is clearly impractical. Several steps can be taken to reduce the required power for illumination. The projection of the instantaneous field of view of the sensor on the ground can be reduced in area. This reduction can be achieved by narrowing the angular field of view of the sensor, or by moving the sensor closer to the scene, or a combination of the two. Since the area decreases in proportion to the square of the range and the square of the angular field of view, decreasing each by a factor of ten would reduce the required illumination power by a factor of 10,000.
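The power figures quoted above can be checked with simple arithmetic; the short calculation below uses only the numbers given in the text (800 W/m² and a 17-m × 11-km footprint) plus an illustrative factor-of-ten reduction in range and field of view.

```python
# Back-of-the-envelope check of the illumination power figures quoted above.
# The 800 W/m^2 irradiance and the 17-m x 11-km AVIRIS footprint come from
# the text; the factor-of-ten reductions are illustrative.
solar_irradiance = 800.0          # W/m^2, peak solar illumination at the surface
footprint_area = 17.0 * 11_000.0  # m^2, one AVIRIS scan line (17 m x 11 km)

power = solar_irradiance * footprint_area
print(f"Full-swath illumination: {power / 1e6:.0f} MW")   # ~150 MW

# Area scales with (range x field of view)^2, so reducing both range and
# angular field of view by 10x cuts the required power by 10^4.
reduced = power / (10 ** 2 * 10 ** 2)
print(f"After 10x range and 10x FOV reduction: {reduced / 1e3:.1f} kW")
```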

The intensity of the artificial illumination can also be reduced, relative to natural solar illumination, but this requires the sensor and associated processing to operate at lower SNRs. A reduction of an order of magnitude over peak solar illumination may be possible, depending upon the scene being illuminated. The decrease in illumination intensity could be achieved either by a uniform reduction in optical power or by selectively reducing illumination in discrete wavebands. For example, with a multispectral sensor, there is no value in illuminating wavebands that are outside the sensor’s detection bands. Another way to reduce the total illumination energy is to pulse the illumination in much the same fashion as a flash unit on a camera.


Through a combination of reduced standoff range, reduced field of view, lower operating SNR, and pulsed band-selective illumination, it is possible to actively illuminate scenes for multispectral and hyperspectral sensors. An early example of such a sensor was the Lincoln Laboratory Multispectral Active/Passive Sensor (MAPS), which included active pulsed laser illumination at 0.85 µm and 10.59 µm, as well as an 8-to-12-µm thermal imager [13]. More recent work, discussed in the article entitled “Active Spectral Imaging,” by Melissa L. Nischan, Rose M. Joseph, Justin C. Libby, and John P. Kerekes, extends the active imaging concept to the hyperspectral regime by using a novel white-light pulsed laser for illumination.

Spectral Imaging Chronology and Outlook

Interesting parallels exist in the development of radar imaging and spectral imaging, and in hyperspectral imaging in particular. Figure 12 displays milestones in the development of multispectral and hyperspectral airborne and spaceborne sensors. Highlights in the development of SAR imaging are also included at the bottom of the figure for comparison. The first operational radar system—the Chain Home Radar—commenced operation in Britain in 1937. Less than sixteen years later, in 1953, side-looking SAR experiments were under way in the United States. Work on airborne SAR eventually led to the development

FIGURE 12. Timeline highlighting development of three different categories of spectral imaging, along with parallel developments in synthetic aperture radar (SAR) imaging. The uppermost timeline represents the evolution of multispectral sensing from airborne experiments through the series of Landsat satellite imagers, culminating in the Advanced Land Imager (ALI) experiment flown on NASA’s Earth Observing (EO-1) satellite. The second timeline illustrates that high-spatial-resolution (≤ 30 m) hyperspectral sensing has been implemented on a number of experimental airborne platforms, including the Hyperspectral Digital Imagery Collection Experiment (HYDICE) and the Airborne Visible Infrared Imaging Spectrometer (AVIRIS). EO-1 also carries a hyperspectral sensor called Hyperion in addition to the multispectral ALI sensor. The third timeline shows that active multispectral and hyperspectral sensing to date is limited to a few research efforts.

[Figure 12 timeline, 1950 to 2010+: VNIR/SWIR/LWIR multispectral imaging (M-7 1970, Landsat-1/ERTS 1972, Landsat-7 1999, MTI 2000, ALI on EO-1 2000+); VNIR/SWIR hyperspectral imaging (AVIRIS 1987, HYDICE 1995, NVIS 1998, Hyperion on EO-1 2000+); active multispectral/hyperspectral imaging (MAPS MSI 1984, AHSI test bed 1998); synthetic aperture (imaging) radar evolution (SLAR 1953, ASARS 1978, SeaSat 1978, ADTS 1982, JSTARS 1997, Global Hawk 2000, GMTI/SAR radar satellite 2010+)]


and launch by NASA in 1978 of the experimental spaceborne Seasat SAR. Exactly a quarter of a century elapsed between the initial demonstration of airborne side-looking SAR and the launch of the spaceborne Seasat SAR.

In comparison, one of the first successful airborne multispectral scanning imagers, the Environmental Research Institute of Michigan (ERIM) M-7, was demonstrated in 1963. Twenty-four years later, in 1987, the first airborne hyperspectral imager, called AVIRIS, was commissioned. AVIRIS was the first earth-looking imaging spectrometer to cover the entire solar-reflectance portion of the spectrum in narrow contiguous spectral channels. Thirteen years after the commissioning of AVIRIS, the first spaceborne hyperspectral sensor, called Hyperion, was launched into orbit on the EO-1. The thirty-four-year time lag between the first airborne SAR in 1953 and the first airborne hyperspectral imager in 1987, and the twenty-two-year lag between the launch of Seasat and Hyperion, suggest that hyperspectral imaging is following a development timeline similar to that of SAR, but is a few decades less mature.

By borrowing from the work on algorithms and the lessons learned in radar and multispectral sensor development, there is reason to believe that the more than twenty-year maturity gap between SAR imaging and hyperspectral imaging can be closed quickly. In comparison to the twenty-five-year gap between the first airborne SAR and the launch of a spaceborne SAR, the gap of only thirteen years between AVIRIS, the first airborne hyperspectral sensor, and the spaceborne Hyperion is an encouraging sign that hyperspectral technology is maturing rapidly.

Table 2 summarizes this overview of spectral imaging systems, comparing the salient features of three hyperspectral sensors—the Hyperspectral Digital Imagery Collection Experiment (HYDICE) [14], AVIRIS, and Hyperion [15]—referenced in Figure 12. Note that the sensors span a range of altitudes and GSD values. Figure 13 (not drawn to scale) compares the ground swath widths of the sensors, illustrating the trade-off among spatial resolution, ACR, and altitude. Figure 13 also shows the swath width of Landsat-7, which precedes EO-1 in orbit by one minute. The data from these and other sensors afford the remote sensing community an opportunity to refine and enhance algorithms and application concepts.

In This Issue

This issue of the Journal contains six articles that discuss areas of spectral image research at Lincoln Laboratory. The first article, “Compensation of Hyperspectral Data for Atmospheric Effects,” by Michael K. Griffin and Hsiao-hua K. Burke, deals with the important issue of atmospheric, or more generally, environmental

Table 2. Comparison of Hyperspectral Imaging Systems

Parameter HYDICE AVIRIS Hyperion

Nominal altitude (km) 1.6 20 705

Swath (km) 0.25 11 7.6

Spatial resolution (m) 0.75 20 30

Spectral coverage (µm) 0.4–2.5 0.4–2.5 0.4–2.5

Spectral resolution (nm) 7–14 10 10

Number of wavebands 210 224 220

Focal-plane pixels (spatial × spectral) 320 × 210 614 × 224 256 × 220

Data-cube size 300 × 320 × 210 512 × 614 × 224 660 × 256 × 220

Cube collection time (sec) ~3 43 3


compensation and characterization. Algorithms that seek to detect or identify materials by spectral signature must account for the effects of the environment, including solar illumination, atmospheric attenuation and scattering, and shadowing. To the user who seeks spectral signatures within an image, the atmospheric artifacts are a nuisance that must be removed. To the atmospheric scientist, however, these artifacts provide a wealth of information about changes in atmospheric gases and aerosols.

The second article, “A Survey of Spectral Unmixing Algorithms,” by Nirmal Keshava, deals with a broad class of algorithms that are central to unraveling the information contained in a hyperspectral signal. Since a given pixel in a hyperspectral image can be comprised of many materials, methods have been devised to estimate the number as well as the types of materials in a pixel, along with the relative fraction of the pixel area that each material occupies. This estimation problem is generally referred to as unmixing.

The third article, “Hyperspectral Image Processing for Automatic Target Detection Applications,” by Dimitris Manolakis, David Marden, and Gary A. Shaw, deals with the automatic detection of spatially resolved and spatially unresolved targets in hyperspectral data. Relative to the traditional problem of land classification, the use of hyperspectral imagery for target detection places new demands on processing and detection algorithms. Targets are generally sparse in the imagery, with too few target pixels to support statistical methods of classification. In addition, a mistake in declaring a target when none is present (a false alarm) can be costly in terms of subsequent actions, or it can simply overwhelm an analyst if too many false targets are declared. This article describes the theoretical basis for automated target detection and also characterizes the performance of detectors on real data, indicating departures from theory.

Collection of hyperspectral imagery to support algorithm research, development, and performance assessment is a costly proposition, in part due to the sensor operating costs, but also due to the need to carefully characterize the scenes (i.e., find the ground truth) in which data are collected. Even when a data-collection campaign runs smoothly, and comprehensive ground truth is provided, the result is still a data set

FIGURE 13. Altitude and area coverage regimes for HYDICE, AVIRIS, and Hyperion (on EO-1) hyperspectral sensor platforms. The nominal swath width and spatial resolution of each sensor are shown in parentheses. Note the difference in swath width between the mature operational Landsat-7 multispectral sensor (185 km) and the experimental hyperspectral sensors—HYDICE (0.25 km), AVIRIS (11 km), and Hyperion (7.6 km).

[Figure 13 labels: HYDICE (0.25 km × 0.75 m), AVIRIS (11 km × 20 m), Hyperion (7.6 km × 30 m), ALI multispectral (37 km × 30 m), and Landsat multispectral (185 km × 30 m); EO-1 at 705-km altitude, trailing Landsat-7 by less than 1 minute]


of limited geographical and temporal scope (limited backgrounds, illumination, seasonal variations) with a limited number of targets. Because of the time-consuming and costly nature of data-collection campaigns, and the limited scope of the data collected, simulation and modeling play an important role in augmenting data sets and examining system trade-offs. The fourth article, “Hyperspectral Imaging System Modeling,” by John P. Kerekes and Jerrold E. Baum, introduces a statistical model for hyperspectral imaging and describes its use to support a variety of performance predictions and performance trade-offs.

Recent advances in technology have opened the possibility for combining multispectral and hyperspectral imaging techniques with active illumination and range gating. The fifth article, “Active Spectral Imaging,” by Melissa L. Nischan, Rose M. Joseph, Justin C. Libby, and John P. Kerekes, describes a research effort to identify applications for—and quantify the performance advantages of—active illumination of scenes in both hyperspectral and multispectral imaging.

Hyperspectral imaging relies primarily on spectral features to detect and identify targets (more precisely, to identify the surface coatings of targets), whereas high-resolution panchromatic imagery and radar imagery rely more on spatially invariant features to detect and identify targets. Each of these sensing modalities has strengths and weaknesses. An important area of research addresses methods for combining or fusing the information from multimodal sensing to improve the overall detection and identification performance against different classes of target. The sixth article, “Multisensor Fusion with Hyperspectral Imaging Data: Detection and Classification,” by Su May Hsu and Hsiao-hua K. Burke, describes several methods for fusing spatial and spectral information from heterogeneous sensors to increase detection and identification performance, as well as provide more easily interpretable fused imagery.

Acknowledgments

The authors would like to thank our colleagues Michael Griffin, Jerry Baum, and Peter Boettcher for their contributions to Figures 2, 4, and 9, respectively, and John Kerekes and Don Lencioni for helpful discussions regarding the spectral-spatial trade space. Much of the work reported in this special issue on spectral imaging was funded by the Office of the Deputy Undersecretary of Defense (Science and Technology).

REFERENCES

1. B.D. Seery, D.B. Cramer, C. Stevens, and D. Lencioni, “EO-1: NASA’s First New Millennium Earth Orbiting Mission,” SPIE 2810, 1996, pp. 4–10.

2. I.J. Allison and E.A. Neil, “Final Report on the TIROS 1 Meteorological Satellite System,” NASA Technical Report R-131, Goddard Space Flight Center, Greenbelt, Md., 1962.

3. W.E. Easias, M.R. Abbott, I. Barton, O.B. Brown, J.W. Campbell, K.I. Carder, D.K. Clark, R.H. Evans, F.E. Hoge, H.R. Gordon, W.M. Balch, R. Letelier, and P.J. Minnett, “An Overview of MODIS Capabilities for Ocean Science Observations,” IEEE Trans. Geosci. Remote Sens. 35 (4), 1998, pp. 1250–1265.

4. S.L. Ustin, D.A. Roberts, and Q.J. Hart, “Seasonal Vegetation Patterns in a California Coastal Savanna Derived from Advanced Visible/Infrared Imaging Spectrometer (AVIRIS) Data,” in Remote Sensing Change Detection: Environmental Monitoring Applications and Methods, C.D. Elvidge and R.S. Lunetta, eds. (Ann Arbor Press, Ann Arbor, Mich., 1998), pp. 163–180.

5. T. Peli, M. Young, and K. Ellis, “Multispectral Change Detection,” SPIE 3071, 1997, pp. 97–105.

6. C.A. Wiley, “Pulsed Doppler Radar Methods and Apparatus,” U.S. Patent No. 3,196,436, 20 July 1965.

7. S.A. Hovanessian, Introduction to Synthetic Array and Imaging Radars (Artech House, Dedham, Mass., 1980).

8. H.A. Zebker and R.M. Goldstein, “Topographic Mapping from Interferometric Synthetic Aperture Radar Observations,” J. Geophys. Res. 91 (B5), 1986, pp. 4993–4999.

9. D. Landgrebe, “Hyperspectral Image Data Analysis,” IEEE Signal Process. Mag. 19 (1), 2002, pp. 17–28.

10. D. Landgrebe, “The Evolution of Landsat Data Analysis,” Photogramm. Eng. and Remote Sens. LXIII (7), 1997, pp. 859–867.

11. D.R. Hearn, C.J. Digenis, D.E. Lencioni, J.A. Mendenhall, J.B. Evans, and R.D. Welch, “EO-1 Advanced Land Imager Overview and Spatial Performance,” Proc. IGARSS 2001 2, Sydney, 9–13 July 2001, pp. 897–900.

12. G. Vane, “First Results from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS),” JPL Publication 87-38, 15 Nov. 1987.

13. A.B. Gschwendtner and W.E. Keicher, “Development of Coherent Laser Radar at Lincoln Laboratory,” Linc. Lab. J. 12 (2), 2000, pp. 383–396.

14. L.J. Rickard, R. Basedow, E. Zalewski, P. Silvergate, and M. Landers, “HYDICE: An Airborne System for Hyperspectral Imaging,” SPIE 1937, 1993, pp. 173–179.

15. J. Pearlman, C. Segal, L. Liao, S. Carman, M. Folkman, B. Browne, L. Ong, and S. Ungar, “Development and Operations of the EO-1 Hyperion Imaging Spectrometer,” SPIE 4135, 2000, pp. 243–253.


APPENDIX: FUNDAMENTALS OF ATMOSPHERIC COMPENSATION

Considerable effort has been devoted to methods for quantifying and compensating for the deleterious effects of the atmosphere on spectral imaging, resulting in a variety of atmospheric compensation methods and models. Most of these methods can be classified either as scene-based statistical methods or physics-based modeling methods. The former use a priori knowledge of the reflectance characteristics of specific reference objects (such as calibration panels) in a scene to develop statistical relationships between the at-sensor observations and the known surface reflectance. The empirical line method (ELM) is one of the oldest and most commonly used statistical methods for atmospheric compensation [1, 2].

The concept behind the ELM method is simple. Field measurements of the surface reflectance of selected reference objects are made on the ground, at close range, under controlled conditions. These measurements represent ground-truth reflectance data. Typically, the reference objects are carefully selected or constructed to provide relatively constant reflectance over the spectral measurement bands of interest, and they are positioned in the scene to ensure good line-of-sight visibility to the airborne or space-based collection platform. During data collection, these reference objects are included in the imagery collected by the sensor. For each spectral band in the sensor data, a linear regression is performed to relate the raw or calibrated radiance measured in each waveband to the corresponding ground-truth surface reflectance. The result is a gain-offset correction factor for each spectral band.

Figure 1 illustrates the concept for three different spectral bands using HYDICE data containing two reference reflectance objects, one a nominal 4% reflectance panel and the other a nominal 32% reflectance panel. In this figure, the average radiance for all the pixels in a given panel is plotted against the ground-truth reflectance measurement. Note that the gain G and offset Lo vary across the bands. Given a

sensor radiance measurement Ls elsewhere in the scene, the reflectance ρ at that location can be estimated, given the gain and offset terms at each waveband, according to the simple formula

ρ = (Ls − Lo) / G.

To a first-order approximation, the offset term Lo is equal to the upwelling path radiance that is superimposed on the desired ground-level radiance signal of interest. The gain term is proportional to the two-way atmospheric transmittance modulated by the solar spectral irradiance.
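A minimal sketch of the ELM computation, assuming two reference panels and a handful of bands with invented radiance values: a per-band linear fit gives G and Lo, which are then inverted to estimate reflectance elsewhere in the scene.

```python
import numpy as np

# Minimal empirical line method (ELM) sketch. The panel reflectances (4% and
# 32%) follow the example in the text; the radiance values are synthetic
# stand-ins for calibrated sensor measurements.
panel_reflectance = np.array([0.04, 0.32])          # ground-truth reflectances
# Mean at-sensor radiance over each panel: one row per panel, one column per band.
panel_radiance = np.array([[ 8.0, 10.0,  5.0],
                           [45.0, 60.0, 30.0]])     # W m^-2 um^-1 sr^-1 (illustrative)

# Per-band linear regression: radiance = G * reflectance + Lo.
G = np.empty(panel_radiance.shape[1])
Lo = np.empty(panel_radiance.shape[1])
for b in range(panel_radiance.shape[1]):
    G[b], Lo[b] = np.polyfit(panel_reflectance, panel_radiance[:, b], 1)

# Invert the fit to estimate reflectance elsewhere in the scene: rho = (Ls - Lo) / G.
Ls = np.array([20.0, 30.0, 12.0])                   # radiance of some other pixel
rho = (Ls - Lo) / G
print("gain:", np.round(G, 2), "offset:", np.round(Lo, 2))
print("estimated reflectance:", np.round(rho, 3))
```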

The ELM generally yields good estimates of the scene’s surface reflectances, provided the reflectances of the reference objects are accurate and spectrally uniform. However, a deficiency of the ELM is the assumption that the atmospheric conditions observed over the reference objects are the same throughout other parts of the scene. In fact, information pertaining

[Figure 1 axes: reflectance (0 to 0.4) versus radiance (0 to 70 W m⁻² µm⁻¹ sr⁻¹); calibration lines shown for the 656-nm, 792-nm, and 1320-nm bands, with data points from the 4% and 32% reflectance panels]

FIGURE 1. Empirical calibration lines for a HYDICE data set from reference reflectance panels positioned in the scene. Three arbitrarily chosen wavelengths illustrate the gain and offset corrections for these bands. The data points defining the lines correspond to two reference panels of 4% and 32% reflectance.


to the intervening atmosphere is not derived with the ELM approach and, consequently, ELM cannot account for any variation in the atmosphere across the scene. For example, topographic variations in the atmospheric path result in elevation-dependent residual atmospheric absorptions. An extreme example, but one that is familiar to all, is pockets of fog in low-lying areas, while adjacent areas, at slightly higher elevations, are fog free.

Figure 2 illustrates the ELM gain and offset correction terms, as a function of wavelength, computed for a HYDICE Forest Radiance-I data collection. Note the similarity in shape between the gain correction term of Figure 2(a) and the solar spectral-radiance curve of Figure 2 in the main article. The ELM-derived reflectance obtained by applying the gain and offset correction to the radiance data measured for one of the reference panels is compared to the ground-truth reflectance in Figure 3. Since the ELM gain and offset corrections are derived in part from measurement of the panel, it is not surprising that the fit is good.

When reference panels cannot be prepositioned in a scene of interest, methods have been developed to estimate the gain and offset correction terms by using certain naturally occurring objects in the scene. For example, smooth bodies of water usually exhibit very low reflectance, so the radiance measured over such dark objects can be attributed to the upwelling path radiance term Lo. However, dependence upon the presence of suitable reference objects in a scene to perform atmospheric compensation is often limiting. Even if such objects can be found, the implied assumption of atmospheric homogeneity across the scene is often violated. For these reasons, physics-based models have been developed to provide atmospheric compensation even when reference objects of known reflectance are not available.

Two of the more commonly used physics-based models are the atmospheric removal (ATREM) algorithm [3] and fast line-of-sight atmospheric analysis of spectral hypercubes (FLAASH) [4]. While ATREM and FLAASH differ in the details of their approach, both include the key step of using band-ratio techniques [5, 6] to quantify the effects of water


FIGURE 2. Empirical line method (ELM) corrections for (a) gain and (b) offset computed from reference panel reflectances and HYDICE calibrated radiance data.

[Figure 3 axes: reflectance versus wavelength (0.4–2.4 µm); curves compare ground-truth reflectance with ELM-derived reflectance]

FIGURE 3. ELM-derived reflectance obtained from HYDICE radiance measurements for a 32% reflectance reference panel.


vapor on the hyperspectral measurements. A detailed discussion and comparison of these physics-based atmospheric compensation techniques is beyond the scope of this article, but quantitative evaluations of performance can be found elsewhere [7, 8], including the article in this issue, “Compensation of Hyperspectral Data for Atmospheric Effects,” by Michael K. Griffin and Hsiao-hua K. Burke.

In brief, the band-ratio technique involves comparing ratios of radiance measurements made near the edges of known atmospheric water-vapor absorption bands in order to estimate the column water vapor in the atmosphere on a pixel-by-pixel basis. Look-up tables (LUT), indexed by measured quantities in the scene, combined with other specified information such as the solar zenith angle, provide the information necessary to estimate reflectance across the scene, without resorting to reference objects within the scene. Other information in addition to reflectance can also be derived from the physics-based models, including estimates of terrain height and aerosol optical depth (visibility). Figure 4 shows a simplified block diagram of the FLAASH processing flow.
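A toy sketch of the band-ratio idea, with invented wavelengths, radiances, and look-up-table values (a real retrieval such as ATREM or FLAASH derives its tables from a radiative-transfer code):

```python
import numpy as np

# Sketch of the band-ratio idea for column water vapor: compare the radiance
# inside the 0.94-um water absorption band to a continuum interpolated from
# its edges, then map the ratio to water vapor through a look-up table.
# Wavelengths, radiances, and LUT values below are illustrative only.
wl_edges = np.array([0.865, 1.040])      # um, window channels bracketing the band
wl_center = 0.940                        # um, absorption-band channel

L_edges = np.array([52.0, 40.0])         # measured radiance at the window channels
L_center = 28.0                          # measured radiance in the absorption band

# Continuum radiance at 0.94 um from linear interpolation between the edges.
L_continuum = np.interp(wl_center, wl_edges, L_edges)
ratio = L_center / L_continuum           # < 1; deeper absorption -> smaller ratio

# Toy look-up table: band ratio -> column water vapor (g/cm^2).
lut_ratio = np.array([0.30, 0.45, 0.60, 0.75, 0.90])
lut_cwv   = np.array([4.0,  3.0,  2.0,  1.0,  0.3])
cwv = np.interp(ratio, lut_ratio, lut_cwv)
print(f"band ratio = {ratio:.2f}, estimated column water vapor = {cwv:.2f} g/cm^2")
```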

REFERENCES

1. D.D. Roberts, Y. Yamaguchi, and R.P. Lyon, “Calibration of Airborne Imaging Spectrometer Data to Percent Reflectance Using Field Spectral Measurements,” Proc. Nineteenth Int. Symp. on Remote Sensing of Environment 2, Ann Arbor, Mich., 21–25 Oct. 1985, pp. 679–688.

2. J.E. Conel, R.O. Green, G. Vane, C.J. Bruegge, R.E. Alley, and B.J. Curtiss, “Airborne Imaging Spectrometer-2: Radiometric Spectral Characteristics and Comparison of Ways to Compensate for the Atmosphere,” SPIE 834, 1987, pp. 140–157.

3. B.-C. Gao, K.B. Heidebrecht, and A.F.H. Goetz, “Derivation of Scaled Surface Reflectances from AVIRIS Data,” Rem. Sens. Environ. 44 (2/3), 1993, pp. 165–178.

4. S.M. Adler-Golden, A. Berk, L.S. Bernstein, S. Richtsmeier, P.K. Acharya, M.W. Matthew, G.P. Anderson, C. Allred, L. Jeong, and J. Chetwynd, “FLAASH, A MODTRAN4 Atmospheric Correction Package for Hyperspectral Data Retrievals and Simulations,” Proc. 7th Ann. JPL Airborne Earth Science Workshop, Pasadena, Calif., 1998, JPL Publication 97-21, pp. 9–14.

5. B.-C. Gao and A.F.H. Goetz, “Column Atmospheric Water Vapor and Vegetation Liquid Water Retrievals from Airborne Imaging Spectrometer Data,” J. Geophys. Res. 95 (D4), 1990, pp. 3549–3564.

6. B.-C. Gao and Y.J. Kaufman, “The MODIS Near-IR Water Vapor Algorithm, Algorithm Technical Background Doc. ID: MOD05: Total Precipitable Water,” NASA Goddard Space Flight Center, Greenbelt, Md., 1997.

7. M.W. Matthew, S.M. Adler-Golden, A. Berk, G. Felde, G.P. Anderson, D. Gorodetzkey, S. Paswaters, and M. Shippert, “Atmospheric Correction of Spectral Imagery: Evaluation of the FLAASH Algorithm with AVIRIS Data,” SPIE 5093, 2003.

8. M.K. Griffin, H.K. Burke, J. Vail, S.M. Adler-Golden, and M. Matthew, “Sensitivity of Atmospheric Compensation Model Retrievals to Input Parameter Specification,” Proc. AVIRIS Earth Science and Applications Workshop, Pasadena, Calif., 1999, JPL Publication 99-17.

[Figure 4 elements: the image radiance cube supplies radiance values in a water-vapor absorption band (e.g., 0.94 µm and/or 1.14 µm), in an absorption band of a well-mixed gas (e.g., O2 at 0.76 µm), and at a dark or known pixel (e.g., 0.45 µm or 0.65 µm); these feed look-up tables, together with solar and sensor angles, sensor information, and other a priori measurements, to yield integrated water vapor, terrain height, and aerosol optical depth; spatial smoothing for adjacency correction precedes output of the surface reflectance cube]

FIGURE 4. Schematic flow of the fast line-of-sight atmospheric analysis of spectral hypercubes (FLAASH) code.


Gary A. Shaw is a senior staff member in the Advanced Space Systems and Concepts group. His current research focus is on collaborative sensing concepts, including energy-efficient techniques for both line-of-sight and non-line-of-sight electro-optical communication and networking. He joined Lincoln Laboratory in 1980 to work on adaptive processing for radar systems, and was a member of the study team that launched a program in the early 1980s to develop concepts for affordable space radar. From 1984 to 1987 he served as test director and assistant leader of the ALCOR and MMW radars at the Kwajalein Reentry Measurements Site. After his Kwajalein tour he served as assistant group leader and then associate group leader in what is now the Sensor Technology and System Applications Group, where he applied his signal processing expertise to radar and passive sensor algorithm development. He received B.S. and M.S. degrees in electrical engineering from the University of South Florida, and a Ph.D. degree in electrical engineering from the Georgia Institute of Technology, where he was both a President’s Fellow and a Schlumberger Fellow.

Hsiao-hua K. Burke is the leader of the Sensor Technology and System Applications group. In the area of remote sensing, she specializes in applications of optical sensors, particularly data analysis and algorithm development. A recent area of interest is the application of hyperspectral technology to the Department of Defense, including fusion of hyperspectral data information with other sensors. She received a Ph.D. degree in atmospheric physics from Rice University, and joined Lincoln Laboratory as a staff member in 1981.