
Page 1:

Remote Sensing and Image Processing: 6

Dr. Mathias (Mat) Disney

UCL Geography

Office: 301, 3rd Floor, Chandler House

Tel: 7670 4290

Email: mdisney@geog.ucl.ac.uk

www.geog.ucl.ac.uk/~mdisney

Page 2:

Image processing: information extraction - spatial filtering and classification

Purpose

• To extract useful information from EO data

• Already seen image enhancement and spectral arithmetic

• Today, spatial filtering and classification

Page 3:

Spatial filtering

• Spatial information
– Things close together are more alike than things further apart (spatial auto-correlation)
– Many features of interest have spatial structure, such as edges, shapes and patterns (roads, rivers, coastlines, irrigation patterns etc.)
• Spatial filters are divided into two broad categories
– Feature detection, e.g. edges
– Image enhancement, e.g. smoothing "speckly" data such as RADAR

Page 4:

Low/high frequency

[Figure: two DN profiles – gradual change in DN = low frequency; rapid change in DN = high frequency]

Page 5:

How do we exploit this?
• Spatial filters highlight or suppress specific features based on spatial frequency
– Related to texture: rapid changes of DN value = "rough", slow changes (or none) = "smooth"

Example 5 x 5 block of DN values (rows annotated):

 43  49  48  49  51   Smooth(ish)
 43  50  65  54  51   Rough(ish)
 12  14   9   9  10   Darker, horizontal linear feature
 43  49  48  49  51   Smooth(ish)
210 225 199 188 189   Bright, horizontal linear feature

Page 6:

Convolution (spatial) filtering
• Construct a "kernel" window (3x3, 5x5, 7x7 etc.) to enhance or remove these spatial features
• Compute a weighted average of the pixels in a moving window, and assign that average value to the centre pixel
• The choice of weights determines how the filter affects the image

Page 7:

Convolution (spatial) filtering
• The filter moves over all pixels in the input, calculating the value of the central pixel each time, e.g.

Input image (DN values):
 43  49  48  49  51
 43  50  65  54  51
 12  14   9   9  10
 43  49  48  49  51
210 225 199 188 189

Filter (3x3 kernel):
1/9 1/9 1/9
1/9 1/9 1/9
1/9 1/9 1/9

Output image:
?? ?? ??

Page 8:

Convolution (spatial) filtering
• For the first pixel in the output image
– Output DN = 1/9*43 + 1/9*49 + 1/9*48 + 1/9*43 + 1/9*50 + 1/9*65 + 1/9*12 + 1/9*14 + 1/9*9 = 37
– Then move the filter one place to the right (blue square) and do the same again, so output DN = 1/9*(49+48+49+50+65+54+14+9+9) = 38.6
– And again… DN = 1/9*(48+49+51+65+54+51+9+9+10) = 38.4
• This is a mean filter
• It acts to "smooth" or blur the image

Output image (first row so far):
37 38.6 38.4
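A minimal Python sketch of this 3 x 3 mean-filter convolution (assuming NumPy is available; the 5 x 5 input is the block of DN values from the slide):

```python
import numpy as np

# 5 x 5 input image of DN values from the slide
image = np.array([
    [ 43,  49,  48,  49,  51],
    [ 43,  50,  65,  54,  51],
    [ 12,  14,   9,   9,  10],
    [ 43,  49,  48,  49,  51],
    [210, 225, 199, 188, 189],
], dtype=float)

# 3 x 3 mean (low-pass) kernel: every weight is 1/9
kernel = np.full((3, 3), 1.0 / 9.0)

# Slide the kernel over every position where it fits entirely inside the
# image and write the weighted average to the centre pixel of the output
rows, cols = image.shape[0] - 2, image.shape[1] - 2
output = np.zeros((rows, cols))
for i in range(rows):
    for j in range(cols):
        output[i, j] = np.sum(image[i:i + 3, j:j + 3] * kernel)

print(np.round(output, 1))  # first row is [37.  38.6 38.4], as on the slide
```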

Page 9:

Convolution (spatial) filtering

• A mean filter is known as a low-pass filter, i.e. it allows low-frequency information to pass through but smooths out higher-frequency (rapidly changing) DN values
– Used to remove high-frequency "speckle" from data
• The opposite is a high-pass filter
– Used to enhance high-frequency information such as lines and point features while getting rid of low-frequency information

[Figure: high-pass filter example]
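As an illustration only (the lecture does not give the exact kernel weights), a commonly used 3 x 3 high-pass kernel applied with SciPy might look like this; the random image is just a placeholder for a real band:

```python
import numpy as np
from scipy.ndimage import convolve

# A common 3 x 3 high-pass kernel: flat regions map to roughly zero,
# while rapid DN changes (edges, points) are amplified
high_pass = np.array([[-1, -1, -1],
                      [-1,  8, -1],
                      [-1, -1, -1]], dtype=float)

# Placeholder image standing in for a real band of EO data
image = np.random.default_rng(0).integers(0, 255, (100, 100)).astype(float)
detail = convolve(image, high_pass)  # keeps only high-frequency information
```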

Page 10:

Convolution (spatial) filtering
• Can also have directional filters
– Used to enhance edge information in a given direction
– A special case of the high-pass filter

[Figures: vertical edge enhancement filter; horizontal edge enhancement filter]
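A sketch of typical directional kernels (Prewitt-style weights, a common choice rather than the exact kernels shown in the lecture figures):

```python
import numpy as np
from scipy.ndimage import convolve

# The left kernel responds to vertical edges (DN changing left-to-right);
# its transpose responds to horizontal edges (DN changing top-to-bottom)
vertical_edge = np.array([[-1, 0, 1],
                          [-1, 0, 1],
                          [-1, 0, 1]], dtype=float)
horizontal_edge = vertical_edge.T

image = np.random.default_rng(1).integers(0, 255, (100, 100)).astype(float)
v_edges = convolve(image, vertical_edge)
h_edges = convolve(image, horizontal_edge)
```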

Page 11:

Practical
• Try out various filters of various sizes
• See what effect each has, and construct your own filters
– High-pass filters are used for edge detection
• Often used in machine vision applications (e.g. robotics and/or industrial applications)
– Directional high-pass filters are used to detect edges of a specific orientation
– Low-pass filters are used to suppress high-frequency information, e.g. to remove "speckle"

Page 12:

Example: low-pass filter

• ERS-1 RADAR image, Norfolk, 18/4/97
• Original (left) and low-pass "smoothed" (right)

Page 13:

Example: high-pass edge detection

• SPOT image, Norfolk, 18/4/97
• Original (left) and directional high-pass filter (edge detection) (right)

Page 14:

Multispectral image classification

• Very widely used method of extracting thematic information

• Use multispectral information

• Separate different land cover classes based on spectral response

• i.e. separability in “feature space”

Page 15:

Feature space
• Use the different spectral responses of different materials to separate them
• E.g. plot red against NIR DN values…
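As a quick illustration, red and NIR DN values can be plotted against each other with matplotlib; the random arrays here are hypothetical stand-ins for real image bands:

```python
import numpy as np
import matplotlib.pyplot as plt

# Stand-in red and NIR bands; in practice these would be read from the image
rng = np.random.default_rng(2)
red = rng.integers(0, 256, 10000)
nir = rng.integers(0, 256, 10000)

# Each pixel becomes one point in two-dimensional feature space
plt.scatter(red, nir, s=1)
plt.xlabel("Red DN")
plt.ylabel("NIR DN")
plt.title("Feature space: red vs NIR")
plt.show()
```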

Page 16:

Supervised classification
• Choose examples of homogeneous regions of given cover types (training regions)
• Training regions contain representative DN values, allowing us to separate classes in feature space (we hope!)
• Go through all pixels in the data set and put each one into the most appropriate class
• How do we choose which is most appropriate?

Page 17:

Supervised classification

• Figures from Lillesand, Kiefer and Chipman (2004)

Page 18:

Supervised classification

• Feature space clusters

• E.g. 2 channels of information

• Are all clusters separate?

Page 19:

Supervised classification

• How do we decide which cluster a given pixel is in?

• E.g. 1: Minimum distance to means classification

• Simple and quick BUT what about points 1 and 2?
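A minimal sketch of minimum-distance-to-means classification (the function name and the training-data layout are illustrative, not from the lecture):

```python
import numpy as np

def minimum_distance_classify(pixels, training):
    """Assign each pixel to the class whose training mean is nearest.

    pixels:   (n_pixels, n_bands) array of DN values
    training: dict mapping class name -> (n_samples, n_bands) training DNs
    """
    names = list(training)
    means = np.array([training[name].mean(axis=0) for name in names])
    # Euclidean (spectral) distance from every pixel to every class mean
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return [names[i] for i in dists.argmin(axis=1)]
```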

Page 20:

Supervised classification

• E.g. 2: Parallelepiped classification
• We calculate the minimum and maximum of each training class in each band (rectangles)

• BUT what about when a class is large and overlaps another?
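A sketch of parallelepiped classification, using the same assumed training-data layout as above; how overlaps are resolved (first match wins) and how pixels outside every box are labelled are arbitrary choices for illustration:

```python
import numpy as np

def parallelepiped_classify(pixels, training, unclassified="unknown"):
    """Assign a pixel to the first class whose per-band min/max box contains it."""
    # One (min, max) "rectangle" per class, per band
    boxes = {name: (samples.min(axis=0), samples.max(axis=0))
             for name, samples in training.items()}
    labels = []
    for px in pixels:
        label = unclassified           # pixels outside every box stay unknown
        for name, (lo, hi) in boxes.items():
            if np.all(px >= lo) and np.all(px <= hi):
                label = name           # overlapping boxes: first match wins
                break
        labels.append(label)
    return labels
```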

Page 21:

Supervised classification

• E.g. 3: Parallelepipeds with stepped decision boundaries

• Gives more flexibility

• Examples are all 2D BUT we can extend any of these ideas into any number of dimensions given more spectral channels

• With 3 channels squares become cubes etc….

Page 22:

Supervised classification
• E.g. 4: Maximum likelihood
• Now we use probability rather than distance in feature space
• Calculate the probability of membership of each class for each pixel
• Results in "probability contours"
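A sketch of Gaussian maximum likelihood classification using SciPy, assuming equal prior probabilities and enough training pixels per class for a well-conditioned covariance matrix (again, function name and data layout are illustrative):

```python
import numpy as np
from scipy.stats import multivariate_normal

def maximum_likelihood_classify(pixels, training):
    """Fit a multivariate normal to each class's training pixels and assign
    every pixel to the class with the highest (log-)probability."""
    models = {name: multivariate_normal(mean=samples.mean(axis=0),
                                        cov=np.cov(samples, rowvar=False))
              for name, samples in training.items()}
    names = list(models)
    # log-probability of every pixel under every class model (equal priors)
    logp = np.column_stack([models[name].logpdf(pixels) for name in names])
    return [names[i] for i in logp.argmax(axis=1)]
```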

Page 23:

Supervised classification

• Now we see that pixel 1 is correctly assigned to the corn class
• Much more sophisticated BUT computationally expensive compared to distance methods

Page 24:

Unsupervised classification

• As would be expected – minimal user intervention

• We tell it how many classes we expect, then some iterative procedure determines where the clusters are in feature space
• It assigns each pixel in turn to a cluster, then updates the cluster statistics and repeats…
– advantage: automatic
– disadvantage: we don't know what the classes represent

Page 25:

Unsupervised classification
• E.g. ISODATA (Iterative Self-Organising Data Analysis) algorithm
– Start with a user-defined number of randomly located clusters
– Assign each pixel to the nearest cluster (mean spectral distance)
– Re-calculate cluster means and standard deviations
– If the distance between two clusters is lower than some threshold, merge them
– If the standard deviation in any one dimension is greater than some threshold, split into two clusters
– Delete clusters with a small number of pixels
– Re-assign pixels, re-calculate cluster statistics etc. until the changes in the clusters are smaller than some fixed threshold
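A simplified ISODATA-style sketch in Python following the steps above; the default thresholds, the fixed iteration cap (used in place of the convergence test) and the merge/split heuristics are illustrative simplifications rather than the canonical algorithm:

```python
import numpy as np

def isodata_sketch(pixels, n_clusters=4, max_iter=20,
                   merge_dist=10.0, split_std=25.0, min_pixels=20, seed=0):
    """Simplified ISODATA-style clustering; pixels is (n_pixels, n_bands)."""
    rng = np.random.default_rng(seed)
    # Start with randomly located cluster means within the data range
    lo, hi = pixels.min(axis=0), pixels.max(axis=0)
    means = rng.uniform(lo, hi, size=(n_clusters, pixels.shape[1]))

    for _ in range(max_iter):
        # Assign each pixel to the nearest cluster mean (spectral distance)
        dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
        labels = dists.argmin(axis=1)

        # Re-calculate cluster means and standard deviations,
        # deleting clusters with too few pixels
        new_means, stds = [], []
        for k in range(len(means)):
            members = pixels[labels == k]
            if len(members) < min_pixels:
                continue
            new_means.append(members.mean(axis=0))
            stds.append(members.std(axis=0))
        if not new_means:            # nothing survived; keep previous means
            break
        means, stds = np.array(new_means), np.array(stds)

        # Split any cluster whose spread in one band exceeds the threshold
        split = []
        for m, s in zip(means, stds):
            if s.max() > split_std:
                offset = np.zeros_like(m)
                offset[s.argmax()] = s.max() / 2
                split.extend([m - offset, m + offset])
            else:
                split.append(m)
        means = np.array(split)

        # Merge pairs of clusters closer together than the merge threshold
        kept, used = [], set()
        for i in range(len(means)):
            if i in used:
                continue
            for j in range(i + 1, len(means)):
                if j not in used and np.linalg.norm(means[i] - means[j]) < merge_dist:
                    means[i] = (means[i] + means[j]) / 2
                    used.add(j)
            kept.append(means[i])
        means = np.array(kept)

    # Final assignment of every pixel to its nearest surviving cluster
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return dists.argmin(axis=1), means
```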

Page 26:

Example: 2 classes, 2 bands

[Figure: sequence of feature-space plots (DN Ch 1 vs DN Ch 2). Panel 1: initial cluster means a and b; pixel 1 is assigned to cluster a, pixel 2 to cluster b. Panel 2: all pixels assigned to a or b and cluster statistics updated; the cluster means move towards pixels 1 and 2 respectively. Panel 3: the standard deviation of cluster a is too large, so a is split into two, the statistics are recalculated and the process repeats…]

Page 27:

Classification Accuracy

• How do we tell if a classification is any good?
– Classification error matrix (aka confusion matrix or contingency table)
– Need "truth" data: sample pixels of known classes
• How many pixels of KNOWN class X are incorrectly classified as anything other than X (errors of omission)?
– Divide the correctly classified pixels in each class of the truth data by the COLUMN totals (Producer's Accuracy)
• How many pixels are incorrectly classified as class X when they should be some other known class (errors of commission)?
– Divide the correctly classified pixels in each class by the ROW totals (User's Accuracy)
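A worked example with a small, made-up error matrix (rows = classified image, columns = "truth" classes, following the slide's row/column convention; the class names and counts are purely illustrative):

```python
import numpy as np

# Hypothetical 3-class error matrix: rows = classified image,
# columns = "truth" classes (the counts are made up for illustration)
classes = ["wheat", "peas", "water"]
matrix = np.array([
    [50,  4,  1],   # pixels classified as wheat
    [ 6, 40,  2],   # pixels classified as peas
    [ 2,  3, 45],   # pixels classified as water
])

correct = np.diag(matrix)
producers_accuracy = correct / matrix.sum(axis=0)   # divide by COLUMN totals
users_accuracy = correct / matrix.sum(axis=1)       # divide by ROW totals
overall_accuracy = correct.sum() / matrix.sum()

for name, p, u in zip(classes, producers_accuracy, users_accuracy):
    print(f"{name}: producer's accuracy {p:.1%}, user's accuracy {u:.1%}")
print(f"overall accuracy {overall_accuracy:.1%}")
```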

Page 28:

Classification Accuracy

[Figure: classification error matrix with the errors of omission and the errors of commission for class U highlighted]

Page 29:

Post-processing?
• Ideally, all classes would be homogeneous
• In reality we get wheat pixels misclassified as "peas" etc. (accuracy < 100%)
• Pass a majority filter over the classified image
– E.g. a 3 x 3 majority filter assigns the output pixel value to be the majority of the nearest 8 pixels

[Figure: majority filter example]
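A minimal sketch of a 3 x 3 majority filter over an integer-coded classified image; edge handling (here, edge pixels are left unchanged) and tie-breaking (lowest class code wins) are arbitrary choices for illustration:

```python
import numpy as np

def majority_filter(classified, size=3):
    """Replace each pixel with the most common class code among its
    neighbours in a size x size window (centre pixel excluded, as on the slide)."""
    out = classified.copy()
    r = size // 2
    rows, cols = classified.shape
    for i in range(r, rows - r):
        for j in range(r, cols - r):
            window = classified[i - r:i + r + 1, j - r:j + r + 1].ravel()
            neighbours = np.delete(window, window.size // 2)  # drop the centre
            out[i, j] = np.bincount(neighbours).argmax()
    return out
```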

Page 30:

Assessed practical
• Classify the Churn Farm airborne image based on the information given to you
• Try supervised methods first
– Accuracy? Test using the "truth" data provided
– Anything << 80% is not much use…
– What do your training data classes look like in feature space?
• Then perhaps unsupervised
– Accuracy?
• Warning: don't include the white text in any of your training data – the pixel values (255 in all channels) will TOTALLY screw up your classes!