SAR and LIDAR data fusion: project presentation

Fusion of Synthetic Aperture Radar and lidar data for mapping of semi-natural areas ©Alessandro Coppola, 2015


TRANSCRIPT

Page 1: SAR and LIDAR data fusion: project presentation

Fusion of Synthetic Aperture Radar and lidar data for mapping of semi-natural areas

©Alessandro Coppola, 2015

1

Page 2

Purposes of the research

• Develop an algorithm to create an enhanced classification map for semi-natural areas, based on the fusion of SAR and lidar datasets

• Apply the general technique to the Maspalomas area, Gran Canaria (case study)

• Validate the algorithm: accuracy assessment

Page 3

SAR vs. lidar

• SAR: active microwave remote sensing technology; produces a grey-scale image whose values depend on the microwave backscattering

• Lidar: active near-infrared remote sensing technology; produces a point cloud, i.e. measurements of the (x, y, z) coordinates of the reflective target

Figures: SAR image, aerial photo (©Google), lidar point cloud (axes: height, reflectivity)

Lidar ranging, with τ the pulse round-trip time and c the speed of light:

d = τ · c / 2
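The range equation can be checked with a few lines of Python (a minimal sketch; the constant and function names are ours):

```python
# Lidar ranging: the sensor measures the round-trip time tau of the
# near-infrared pulse, so the one-way distance is d = tau * c / 2
# (the factor 2 accounts for the trip to the target and back).
C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range(tau_s: float) -> float:
    """Distance to the reflective target (m) for round-trip time tau_s (s)."""
    return tau_s * C / 2.0

# A 1 microsecond round trip corresponds to roughly 150 m.
```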

Page 4

Priority classes algorithm

[Flowchart: Image 1 and Image 2 → feature extraction → feature-level fusion → binary segmentation → supervised labelling; Mask 1 … Mask N route ON pixels to CLASS 1 … CLASS N in priority order, OFF pixels pass to the next mask, and remaining pixels become CLASS N+1; the single-class outputs are combined into the MAP]

• Single classes are detected with masks, in order of priority

• Every class is masked off in the following steps

• The single-class images are combined to obtain the final classification map

• Masks are built from features; classification relies on scattering models
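The priority chain can be sketched in a few lines (a hypothetical helper, not the authors' code): each mask claims its ON pixels for one class, and pixels already labelled by a higher-priority mask are skipped, so later masks only ever see what is still OFF.

```python
def priority_classify(prioritized_masks, pixels, unclassified="UNCLASS"):
    """Combine binary class masks in priority order into one label map.

    prioritized_masks: ordered list of (class_name, mask) pairs, highest
    priority first, where each mask is the set of pixels that are ON.
    A pixel keeps the first label that claims it, i.e. every class is
    masked off in all the following steps.
    """
    labels = {}
    for class_name, mask in prioritized_masks:
        for px in mask:
            labels.setdefault(px, class_name)  # higher priority wins
    # anything no mask claimed ends up as CLASS N+1 (unclassified)
    return {px: labels.get(px, unclassified) for px in pixels}
```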

Page 5

Study area and datasets

Project ARTeMISat

• Protected and ecologically sensitive area threatened by human presence

• Maspalomas Natural Reserve (Gran Canaria)

Figure: Maspalomas area (©2015 GRAFCAN) and ground truth

SAR data
  Mission/Satellite: TerraSAR-X (9 GHz)
  Processing level: L1B
  Date: 2008-01-05
  Sensor mode: Spotlight
  Incidence angle: 42°
  Product type: Enhanced Ellipsoid Corrected
  Spatial resolution: 1.4 m
  Polarization: HH

Lidar data
  Mission: PNOA (2009)
  Pulse rate: 45 kHz
  Point spacing: 1.41 m
  Altimetric accuracy (RMSEz): 0.20 m
  Deviation from vertical axis: 5°
  Orthophoto pixel size: 0.25 m

9 classes: sea, swimming pools, sand, asphalt, trees, shrubs, grass, terrain, buildings

Page 6

Images

Figures: SAR image; aerial photo (©Google); lidar intensity; lidar DEM (Digital Elevation Model); lidar orthophoto; lidar DSM (Digital Surface Model)

Page 7

Sea and swimming pools masks

• Sea: DEM mask (ON ⇒ SEA)

• Swimming pools: highest reflectance in the blue wavelengths, lower reflectance in the red wavelengths

Figures: orthophoto, NDSPI, swimming pools mask; spectral signatures of five swimming pools and of three rivers

Normalized Difference Swimming Pool Index:

NDSPI = (BLUE − RED) / (BLUE + RED), NDSPI ∈ [−1, 1]

• 95 % of the DNs fall in [−0.2, 0.2]

• NDSPI provides the highest values in swimming pools: NDSPI > 0.18 ⇒ swimming pools (NDSPI mask ON ⇒ POOLS)
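The NDSPI threshold rule translates directly into code (a sketch; function names are ours):

```python
def ndspi(blue: float, red: float) -> float:
    """Normalized Difference Swimming Pool Index, in [-1, 1]."""
    return (blue - red) / (blue + red)

def pool_mask_on(blue: float, red: float, threshold: float = 0.18) -> bool:
    # Pools combine high blue reflectance with low red reflectance,
    # so they sit in the upper tail of the NDSPI distribution.
    return ndspi(blue, red) > threshold
```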

Page 8

Texture analysis on SAR data: dissimilarity

• Grey-level co-occurrence matrix (GLCM)

• Co-occurrence measures (Haralick features)

• Dissimilarity:

f_DIS = Σ_{i=0}^{M−1} Σ_{j=0}^{M−1} |i − j| · p(i, j)

where p(i, j) is the (i, j)-th entry in the normalized co-occurrence matrix and M is the number of grey levels (the dimension of the matrix).

f_DIS increases linearly with increased contrast between neighbouring pixels.

• Dunes (sand): dark background peppered with bright pixels

• f_DIS < 2 ⇒ sand

• 2 < f_DIS < 9 ⇒ darkest pixels (terrain, asphalt, grass)

• f_DIS > 9 ⇒ brightest pixels (buildings, vegetation)
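The dissimilarity formula transcribes directly for a small normalized GLCM (pure-Python sketch):

```python
def glcm_dissimilarity(p):
    """Haralick dissimilarity f_DIS = sum_{i,j} |i - j| * p(i, j) of an
    M x M normalized grey-level co-occurrence matrix p.

    The |i - j| weight grows linearly with grey-level difference, so
    f_DIS rises linearly with contrast between neighbouring pixels.
    """
    m = len(p)
    return sum(abs(i - j) * p[i][j] for i in range(m) for j in range(m))
```

A purely diagonal matrix (identical neighbours, zero contrast) gives f_DIS = 0; off-diagonal mass raises it.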

Dissimilarity mask 2 is applied to the lidar intensity, after SEA and POOLS have been masked off and dissimilarity mask 1 has turned SAND on.

Figures: orthophoto, dissimilarity, sand mask; dissimilarity, masked dissimilarity, masked lidar intensity

Page 9

Shrubs, trees and buildings masks

Figures: multiple returns, shrubs mask, trees mask, buildings mask

A laser pulse can penetrate the tree canopy, resulting in multiple returns.

Multiple-return intensity: I_ML = I_FR − I_LR (first-return minus last-return intensity)

Normalized DSM: nDSM = DSM − DEM

• I_ML ≠ 0 and nDSM > 2 ⇒ trees

• I_ML ≠ 0 and nDSM < 2 ⇒ shrubs

• I_ML = 0 and nDSM > 5 ⇒ buildings
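The three rules become a per-pixel decision (sketch; variable and function names are ours, thresholds in metres are the ones quoted on the slide):

```python
def tall_object_class(i_first, i_last, ndsm):
    """Classify a pixel from multiple returns and object height.

    i_ml = i_first - i_last is nonzero where the laser pulse penetrated
    a canopy and produced distinct first/last returns; ndsm = DSM - DEM
    is the object height above ground in metres.
    """
    i_ml = i_first - i_last
    if i_ml != 0 and ndsm > 2:
        return "TREES"      # penetrable and tall
    if i_ml != 0 and ndsm < 2:
        return "SHRUBS"     # penetrable but low
    if i_ml == 0 and ndsm > 5:
        return "BUILDINGS"  # solid roof, tall
    return "UNCLASS"
```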

[Flowchart: after SEA, POOLS and SAND are masked off, dissimilarity mask 2, the ML mask and the nDSM masks turn TREES, SHRUBS and BUILDINGS on; remaining OFF pixels stay unclassified]

Page 10

Asphalt, grass and terrain masks

• Manmade materials have the lowest lidar intensity returns (below 50 %)

• The masked lidar intensity image has 95 % of its values between 0 and 90

I_LID < 45 ⇒ asphalt

Figures: masked lidar intensity, asphalt mask; NDVI, grass mask; terrain mask

Normalized Difference Vegetation Index:

NDVI = (NIR − RED) / (NIR + RED), NDVI ∈ [−1, 1]

Healthy vegetation falls between values of 0.20 and 0.80:

• NDVI ∈ [0.20, 0.80] ⇒ grass

• NDVI ∉ [0.20, 0.80] and nDSM = 0 ⇒ terrain
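A sketch of the NDVI-based rules (function names are ours, thresholds from the slide):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index, in [-1, 1]."""
    return (nir - red) / (nir + red)

def grass_or_terrain(nir, red, ndsm):
    v = ndvi(nir, red)
    if 0.20 <= v <= 0.80:
        return "GRASS"      # healthy low vegetation
    if ndsm == 0:
        return "TERRAIN"    # bare, flat ground
    return "UNCLASS"
```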

[Flowchart: after SEA, POOLS and SAND are masked off, the dissimilarity mask and lidar intensity mask turn ASPHALT on, the NDVI mask turns GRASS on, and the nDSM mask turns TERRAIN on; remaining OFF pixels stay unclassified]

Page 11

Priority classes algorithm

[Flowchart of the full priority chain: DEM mask (ON) → SEA; NDSPI mask (ON) → POOLS; dissimilarity mask 1 (ON) → SAND; ML mask and nDSM mask (ON) → TREES or SHRUBS; nDSM mask (ON) → BUILDINGS; dissimilarity mask 2 and lidar intensity mask (ON) → ASPHALT; NDVI mask (ON) → GRASS; nDSM mask (ON) → TERRAIN; OFF pixels at each stage pass to the next mask, and any remaining pixels stay unclassified]

Page 12

Page 13

Final fused classification image

Legend: SEA, POOLS, SAND, ASPHALT, GRASS, TERRAIN, HIGH VEG., SHRUBS, BUILDINGS

Unclassified: 6.5 %

Page 14

Validation

• Confusion matrix and overall accuracy

• Four "macro-classes": built soil, vegetation, bare soil, water

• Comparison with a maximum likelihood (ML) classifier on lidar data alone ⇒ overall accuracy of 71.09 %

Confusion matrix for the fused data classification (user class vs. reference class, %):

                 Built soil   Vegetation    Water   Bare soil
  Built soil        87.50        3.13        9.38      6.25
  Vegetation         0.00       78.13        0.00      3.13
  Water              3.13        0.00       84.38      0.00
  Bare soil          9.38       18.75        6.25     90.63
  Tot.             100.00      100.00      100.00    100.00

Overall accuracy: 85.15 %
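Overall accuracy is the trace of the confusion matrix divided by its total. With raw sample counts (a sketch, not the authors' code):

```python
def overall_accuracy(confusion):
    """Overall accuracy of a square confusion matrix in raw counts
    (rows: mapped class, columns: reference class): correctly
    classified samples on the diagonal divided by all samples."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total
```

Since the matrix on the slide is column-normalized and the ≈3.13 % steps suggest roughly equal reference sample sizes per macro-class, averaging its diagonal ((87.50 + 78.13 + 84.38 + 90.63) / 4 ≈ 85.2 %) is consistent with the reported 85.15 %.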

Page 15

Conclusions

• Overall accuracy ≈ 85 % ⇒ an improvement of 14 % compared to lidar data alone

• Original algorithm

• Repeatability

• High-resolution SAR data

• Features (Haralick measures, DEM, DSM, multiple returns, NDVI)

• Training samples for thresholding