
Multispectral Remote Sensing

• MultiSpec program from Purdue

• On Information Extraction Principles for Hyperspectral Data, David Landgrebe, School of Electrical and Computer Engineering, Purdue University.

Spaces

• Image space – a rendering of the data from (usually) up to three of the many sensors (see the sketch after this list)

• Spectral space – a function space whose elements are spectra or linear combinations of spectra

• Feature space – representation, representation, representation
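
As a rough, non-authoritative illustration of the image-space idea, here is a minimal Python sketch that renders three of the many sensor channels as an RGB composite. It assumes the data is already loaded as a NumPy array of shape (rows, cols, bands); the function name and the band indices are arbitrary choices for illustration, not values from the slides or from MultiSpec.

import numpy as np

def rgb_composite(cube, bands=(29, 19, 9)):
    """Render an image-space view from three of the many channels.

    cube  : ndarray of shape (rows, cols, n_bands) -- hyperspectral data
    bands : channel indices mapped to (R, G, B); arbitrary placeholder choices
    """
    rgb = cube[:, :, list(bands)].astype(float)
    # Stretch each channel independently to [0, 1] for display.
    for k in range(3):
        ch = rgb[:, :, k]
        lo, hi = ch.min(), ch.max()
        rgb[:, :, k] = (ch - lo) / (hi - lo) if hi > lo else 0.0
    return rgb

# Usage: a synthetic 100 x 100 scene with 220 channels (sizes are made up).
cube = np.random.rand(100, 100, 220)
img = rgb_composite(cube)   # shape (100, 100, 3), values in [0, 1]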

Image Space

• http://edcwww.cr.usgs.gov/Webglis/glisbin/guide.pl/glis/hyper/guide/napp

Spectral Space (Theory)

[Figure: idealized reflectance vs. wavelength curves for Road, Vegetation, and Water]

Spectral Space (Reality)

[Figure: measured reflectance vs. wavelength curves for Road, Vegetation, Water, and an Unknown spectrum]

Feature Space

[Figure: Road, Vegetation, Water, and Unknown samples plotted against sensor response at λ1 and sensor response at λ2]

Feature Space

• Choosing features can be hard.

• Start with the native data representation, for example (a sketch follows this list):
– sensor responses at band centers
– mean sensor response in band
– other weighted averages
– sensor response at specified wavelengths
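
A minimal sketch of a few of these native feature choices, assuming the spectrum of one pixel is a 1-D NumPy array with a matching array of wavelengths; the helper names, band limits, and wavelength grid are assumptions made for illustration.

import numpy as np

def response_at(spectrum, wavelengths, wl):
    """Sensor response at a specified wavelength (nearest sampled channel)."""
    return float(spectrum[np.argmin(np.abs(wavelengths - wl))])

def band_mean(spectrum, wavelengths, lo, hi):
    """Mean sensor response over the band [lo, hi]."""
    mask = (wavelengths >= lo) & (wavelengths <= hi)
    return float(spectrum[mask].mean())

def weighted_average(spectrum, weights):
    """A general weighted average of the responses."""
    return float(np.dot(spectrum, weights) / np.sum(weights))

# Usage with a made-up 220-channel spectrum sampled from 400-2500 nm.
wavelengths = np.linspace(400, 2500, 220)
spectrum = np.random.rand(220)
features = [
    response_at(spectrum, wavelengths, 550),     # response at a band center
    band_mean(spectrum, wavelengths, 600, 700),  # mean response in a band
    weighted_average(spectrum, np.ones(220)),    # a (trivial) weighted average
]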

Feature Space

[Figure: Road, Vegetation, and Water clusters plotted against sensor response at λ1 and sensor response at λ2]

Classifying

[Figure: Road and Vegetation samples plotted against sensor response at λ1 and sensor response at λ2]

• Nearest mean may be wrong.

• Second order is better (Gaussian ML decision boundary).
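
To make the caption concrete, here is a hedged sketch (not from the slides) with two synthetic classes whose covariances differ sharply; the class names, means, and covariances are made up. For the chosen test point the nearest-mean rule picks the wrong class, while the second-order (Gaussian ML) discriminant defined on the following slides does not.

import numpy as np

rng = np.random.default_rng(0)

# Two synthetic classes in a two-feature space with very different covariances:
# "Road" is a tight round cluster, "Vegetation" is elongated along feature 1.
road = rng.multivariate_normal([2.0, 2.0], [[0.05, 0.0], [0.0, 0.05]], 500)
veg  = rng.multivariate_normal([4.0, 2.0], [[2.00, 0.0], [0.0, 0.05]], 500)

means = [road.mean(axis=0), veg.mean(axis=0)]
covs  = [np.cov(road, rowvar=False), np.cov(veg, rowvar=False)]

def nearest_mean(x):
    return int(np.argmin([np.linalg.norm(x - m) for m in means]))

def gaussian_ml(x):
    # gC(x) = -1/2 ln|SigmaC| - 1/2 (x - muC)^T SigmaC^-1 (x - muC)
    g = [-0.5 * np.log(np.linalg.det(S)) - 0.5 * (x - m) @ np.linalg.inv(S) @ (x - m)
         for m, S in zip(means, covs)]
    return int(np.argmax(g))

# A point lying along the elongated Vegetation cluster but closer to the Road mean:
x = np.array([2.8, 2.0])
print(nearest_mean(x))   # 0 -> Road       (nearest mean gets it wrong)
print(gaussian_ml(x))    # 1 -> Vegetation (second-order rule uses the covariance)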

Statistical Moments

Nf = number of features, Ns = number of samples

Samples x(i, j), i = 1…Ns, j = 1…Nf, drawn from a class C

μC(j) = mean of x(:, j)

ΣC = CovC(r, s) = mean{ (x(:, r) − μC(r)) (x(:, s) − μC(s)) }

These are the sample moments. Training = finding enough samples that these become good estimates of the true moments.
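
A minimal sketch of computing these sample moments with NumPy, assuming the training pixels of one class are stacked into an Ns × Nf array; the array sizes in the usage lines are made up.

import numpy as np

def class_moments(samples):
    """Sample moments for one class C.

    samples : ndarray of shape (Ns, Nf) -- training pixels x(i, j) for class C
    Returns muC (length-Nf mean vector) and SigmaC (Nf x Nf sample covariance).
    """
    mu = samples.mean(axis=0)              # muC(j) = mean of x(:, j)
    d = samples - mu
    sigma = d.T @ d / samples.shape[0]     # mean{ (x - muC)(x - muC)^T }
    return mu, sigma

# Usage: Ns = 200 made-up training pixels with Nf = 4 features for one class.
x = np.random.rand(200, 4)
mu_c, sigma_c = class_moments(x)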

Quadratic (Gaussian) Classifier

• x is an element of an unknown class (e.g., a pixel of unknown classification). It is a column vector of size Nf. What class is it in?

• gC(x) = −(1/2) ln(|ΣC|) − (1/2)(x − μC)ᵀ ΣC⁻¹ (x − μC)

• Choose class C if gC(x) ≥ gD(x) for all other classes D (a sketch follows this list).

• If all the classes are Gaussian, this is a maximum-likelihood classifier. Among all possible classifiers, it minimizes a reasonable error measure.
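
A minimal sketch of the resulting classifier, assuming labeled training samples are available as Ns × Nf arrays per class; the class names and the synthetic training data are illustrative assumptions, and this is not MultiSpec's actual implementation.

import numpy as np

def fit_quadratic(samples_by_class):
    """Estimate (muC, SigmaC) for each class from its training samples (Ns x Nf arrays)."""
    params = {}
    for c, x in samples_by_class.items():
        mu = x.mean(axis=0)
        d = x - mu
        params[c] = (mu, d.T @ d / x.shape[0])
    return params

def classify(x, params):
    """Assign x to the class with the largest discriminant
    gC(x) = -1/2 ln|SigmaC| - 1/2 (x - muC)^T SigmaC^-1 (x - muC)."""
    best, best_g = None, -np.inf
    for c, (mu, sigma) in params.items():
        d = x - mu
        g = -0.5 * np.log(np.linalg.det(sigma)) - 0.5 * d @ np.linalg.inv(sigma) @ d
        if g > best_g:
            best, best_g = c, g
    return best

# Usage with made-up training data: three classes, Nf = 4 features each.
rng = np.random.default_rng(1)
train = {name: rng.normal(loc=i, scale=0.5 + i, size=(300, 4))
         for i, name in enumerate(["road", "vegetation", "water"])}
params = fit_quadratic(train)
label = classify(rng.normal(loc=0.0, scale=0.5, size=4), params)  # likely "road" given how the data was generated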

Improving the feature space

• Want to:
– minimize dimension
– find a basis of uncorrelated features
– simplify calculations

• Eigenvalue analysis; ANOVA; Karhunen-Loève (KL) decomposition; … (a sketch of the KL approach follows)
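
A minimal sketch of the eigenvalue / Karhunen-Loève route to a smaller set of uncorrelated features, assuming the samples sit in an Ns × Nf NumPy array; the number of retained features and the example sizes are arbitrary choices.

import numpy as np

def kl_transform(samples, n_keep):
    """Karhunen-Loeve (principal component) transform: project onto the
    eigenvectors of the sample covariance to get uncorrelated features,
    keeping only the n_keep directions of largest variance.

    samples : ndarray of shape (Ns, Nf)
    """
    mu = samples.mean(axis=0)
    centered = samples - mu
    sigma = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(sigma)   # eigh: sigma is symmetric
    order = np.argsort(eigvals)[::-1]          # largest variance first
    basis = eigvecs[:, order[:n_keep]]         # Nf x n_keep
    return centered @ basis, basis, mu

# Usage: reduce 220 made-up features to 10 uncorrelated ones.
x = np.random.rand(1000, 220)
z, basis, mu = kl_transform(x, n_keep=10)
# np.cov(z, rowvar=False) is (approximately) diagonal.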