
TRANSCRIPT

Page 1: U.S. Geological Survey

U.S. Department of the Interior
U.S. Geological Survey

ASPRS LiDAR Calibration and QA telecon results

ASPRS, 4 May 2011

Greg Stensaas, Remote Sensing Technologies Project Manager
Data Management Branch
USGS/EROS Center, Sioux Falls, SD

Page 2: U.S. Geological Survey

Background

· Currently, the LiDAR system calibration is defined by a handful of parameters, and there is an ongoing effort to derive these parameters consistently for every project.

· However, there is a gap in understanding how these parameters relate to the accuracy of the final data and derived products on the ground.

· This gap has severely restricted the ability of local and state governments to fully leverage the potential of LiDAR data.

· Solving the problem requires a thorough analysis and definition of the calibration parameters and their effects on ground accuracy, and the definition of a common process via ASPRS (the sketch below illustrates the parameter-to-accuracy link).
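To make the parameter-to-accuracy link concrete, here is a minimal Python sketch of the standard direct-georeferencing relation (ground point = GNSS position + attitude × (boresight × laser vector + lever arm)), showing how a small roll boresight error shifts the ground coordinate. The flight geometry and the error value are illustrative assumptions, not measured values.

import numpy as np

def rot(roll, pitch, heading):
    """Rotation matrix built from roll, pitch, heading angles (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    ch, sh = np.cos(heading), np.sin(heading)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ch, -sh, 0], [sh, ch, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

sensor_pos = np.array([0.0, 0.0, 2000.0])   # assumed GNSS position, 2000 m AGL
r_laser = np.array([0.0, 0.0, -2000.0])     # nadir laser return in the sensor frame
lever_arm = np.zeros(3)                     # assumed zero for clarity

def ground_point(bore_roll, bore_pitch, bore_heading):
    """Ground coordinate of the return for a given boresight misalignment."""
    R_imu = np.eye(3)                        # level flight, for simplicity
    R_bore = rot(bore_roll, bore_pitch, bore_heading)
    return sensor_pos + R_imu @ (R_bore @ r_laser + lever_arm)

nominal = ground_point(0.0, 0.0, 0.0)
shifted = ground_point(np.radians(0.01), 0.0, 0.0)  # 0.01 deg roll boresight error
print(shifted - nominal)   # ~0.35 m horizontal displacement at 2000 m AGL

Even a hundredth of a degree of uncorrected boresight roll produces a decimeter-scale ground shift at typical flying heights, which is why consistent calibration derivation matters for the final product accuracy.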


Page 3: U.S. Geological Survey

Background

· Discussion at Fall ASPRS (and many previous ASPRS presentations and committee meetings) on the strong need for common cal/val and QA processes

· USGS Specification v13 and associated QA/QC needs

· 4 monthly telecons and ILMF discussions

· LiDAR QA/QC Face-to-Face meeting, Friday, May 2, 2011, ASPRS annual conference, Milwaukee


Page 4: U.S. Geological Survey

Objective

· The objective of the Face-to-Face meeting was to elaborate on the LiDAR QA/calibration activities of the last few months and to assign tasks and actions. The meeting sought to establish the need and actions for solving and documenting the LiDAR cal/val and QA issues, and to define how to get it done.

· Agenda:

· Welcome, introduction, and purpose of the meeting
· LiDAR QA/QC issues/problems
· Summary of the previous 4 telecons
· Discussion of LiDAR specifications: write, review, and edit LiDAR QA/calibration terms and processes
· The LiDAR calibration spreadsheet
· Call for volunteers for writing, reviewing, and editing
· Discussion of the format of the QA/calibration documents; models include LAS, etc.
· Define the way forward


Page 5: U.S. Geological Survey

Purpose

· During the telecons and F2F meeting, we have agreed on the need for a coordinated QA/calibration process for LiDAR.

· This effort has resulted in a lot of important discussion that will hopefully lead to ASPRS documentation of QA/calibration processes for LiDAR.

· Currently, the members attending the telecon are in the process of defining the salient terminology, definitions, and concepts required to unambiguously describe the QA/calibration process.

· In this regard, a matrix sheet synthesizing these activities has been generated, and members are requested to volunteer to document, review, and edit these LiDAR QA/calibration processes, terms, and definitions based on the matrix.


Page 6: U.S. Geological Survey

Error Process Spreadsheet

For each error class, the matrix walks through five columns:

· Identification: What is it that I don't want to see in the data? What do we name this type of defect? What are the characteristics of the identified type of error?

· Schematic Illustration of Manifested Error: What does the error look like? How do I know it when I see it? What is the math model that describes it?

· Quantification: What level of this effect is acceptable in the data set? Is it a percentage of swath? A percentage of another spec (e.g., the elevation spec)? A fixed value?

· Measurement: How do I determine what amount of this identified anomaly is acceptable in the data? Algorithms? Sample size (number)? Sample size (area and shape)? Location of samples? Number of samples? What is the nature of the control data required? Control features versus control points, and the accuracy of the control data required. (A worked example follows this matrix.)

· Reporting: How should the result be reported? A readable document? Point cloud processor log or registry files? Flight line by flight line? Lift by lift? What units should be used to describe the extent of any given error type?

Error Class: Spatial Accuracy

· latitude error to ground control
· longitude error to ground control
· elevation error to ground control
· roll error
· pitch error
· heading error
· "pop-ups" on reflective striping
· scan-to-scan errors due to internal alignments of laser to scanning device
· intra-scan errors due to internal alignments of laser to scanning device
· positive-to-negative scan differences (typically elevation differences between left-bound and right-bound scans, typically worst near the edge of swath)
· harmonic effects that may go in and out of phase along the flight line; normally observed as an anomaly from scan line to scan line
· harmonic effects that may go in and out of phase within a scan; normally observed as an anomaly from scan line to scan line
· etc.
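As a worked example of the Measurement and Reporting columns for elevation error to ground control, the Python sketch below computes vertical RMSE against surveyed checkpoints and the corresponding NSSDA-style vertical accuracy at the 95% confidence level (1.9600 × RMSEz). The checkpoint elevations are invented for illustration; this is a sketch, not the ASPRS-agreed procedure.

import numpy as np

def vertical_rmse(control_z, lidar_z):
    """RMSE of LiDAR-derived elevations against surveyed checkpoint elevations."""
    diff = np.asarray(lidar_z, dtype=float) - np.asarray(control_z, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

# Invented checkpoint elevations (metres): surveyed vs. interpolated LiDAR surface.
control_z = [431.20, 433.75, 429.90, 435.10, 432.60]
lidar_z   = [431.27, 433.66, 430.02, 435.05, 432.71]

rmse = vertical_rmse(control_z, lidar_z)
print(f"vertical RMSE:    {rmse:.3f} m")
# NSSDA reports vertical accuracy at the 95% confidence level as 1.9600 * RMSEz.
print(f"accuracy_z (95%): {1.9600 * rmse:.3f} m")

A report following the matrix would state the sample size, checkpoint locations, control accuracy, and the units of the reported statistic, per the Reporting column.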

Page 7: U.S. Geological Survey

The matrix continues with the same five columns and guiding questions (Identification, Schematic Illustration of Manifested Error, Quantification, Measurement, Reporting) applied to two further error classes.

Error Class: Radiometric Accuracy

· consistent radiometry flight line to flight line (see the sketch below)
· "blooming" or saturation, similar to over-exposure in photos
· "drop-outs" on low-reflectivity surfaces
· intensity output calibration, if there is a desire to output a reflectivity value as opposed to an intensity value
· scan-line-to-scan-line radiometry variation (stripes of over- or under-exposed data)
· intra-scan-line radiometric consistency ("feathers" at high-contrast boundaries)
· positive-to-negative scan differences (typically intensity differences between left-bound and right-bound scans, typically worst near the edge of swath)
· etc.
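As one way to screen for the flight-line-to-flight-line radiometry defect above, the sketch below fits a single least-squares gain between intensities sampled over the same ground in the overlap of two adjacent flight lines. A real radiometric check would also account for range, incidence angle, and receiver settings; the sample intensities here are invented.

import numpy as np

def overlap_gain(intensity_a, intensity_b):
    """Least-squares gain g matching line B's overlap intensities to line A's,
    i.e. the g that minimizes ||a - g*b||^2 for a pure scale model."""
    a = np.asarray(intensity_a, dtype=float)
    b = np.asarray(intensity_b, dtype=float)
    return float((a * b).sum() / (b * b).sum())

line_a = [112, 118, 109, 121, 115]   # intensities sampled in the overlap, line A
line_b = [ 98, 104,  95, 107, 101]   # same ground, line B: systematically darker
g = overlap_gain(line_a, line_b)
print(f"gain to match line B to line A: {g:.3f}")   # ~1.14

A gain far from 1.0, or one that varies across the scan, flags the inconsistency for the Quantification and Reporting columns.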

Error Class: Point Pattern

· average point density
· worst-case point density
· worst-case cross-track spacing
· worst-case along-track spacing
· worst-case cross-track : along-track spacing ratio
· max cut-off, or equivalent
· etc.

NOTE: A Fugro Horizons paper on point-pattern nominal post spacing quantifies deviation from an idealized raster pattern. (A simple density-statistics sketch follows below.)
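The density metrics above can be approximated with a simple grid count. The sketch below (a plain gridded histogram, not the Fugro Horizons method cited in the note) estimates average and worst-case point density from point coordinates; the synthetic swath is an assumption for illustration.

import numpy as np

def density_stats(x, y, cell=1.0):
    """Average and worst-case point density (points/m^2) on a square grid."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    ix = ((x - x.min()) // cell).astype(int)
    iy = ((y - y.min()) // cell).astype(int)
    counts = np.zeros((ix.max() + 1, iy.max() + 1))
    np.add.at(counts, (ix, iy), 1)      # histogram points into grid cells
    area = cell * cell
    return counts.mean() / area, counts.min() / area

# Illustrative synthetic swath: ~2 points/m^2 over a 100 m x 100 m tile.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 100.0, 20000)
y = rng.uniform(0.0, 100.0, 20000)
avg, worst = density_stats(x, y, cell=5.0)
print(f"average: {avg:.2f} pts/m^2, worst-case: {worst:.2f} pts/m^2")

The cell size trades sensitivity for noise: smaller cells expose local gaps (worst-case density) but exaggerate random variation.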

Page 8: U.S. Geological Survey

Next Steps

· Continue monthly telecons; establish working-group face-to-face meetings

· Include additional interested Airborne and Mobile Mapping LiDAR Sub-committees and PDAD members

· Define the outline and matrix the work

· Provide an enhanced work matrix and obtain documents

· Data link: ftp://edcftp.cr.usgs.gov/edcuser/stensaas/outgoing/LiDAR%20Calibration/

· Compile input and peer review

· Continue to support ASPRS LiDAR QA and calibration guidelines and best practices

· Many support groups, including work by the NGA GWG

· Draft by Fall ASPRS


Page 9: U.S. Geological Survey

Questions?


Page 10: U.S. Geological Survey

USGS Cal/Val Basemap range: hi-res image and LiDAR data

[Map graphic: large-area geometric test range with geometric targets and control; 3-inch, 6-inch, and 12-inch imagery coverage over Minnehaha County, Sioux Falls, and Lincoln County.]

· During the research effort, ranges were prepared as part of the preparation to support Sensor Assessment and Data Provider Evaluation
· The operational Data Provider evaluation process is now stopped; research evaluation of sensors only
· Developing Cal/Val Range Standards and 5 National Ranges
· Dual use for hi-res ortho & satellite, and LiDAR cal/val
· Data Provider Evaluation & Cal/Val Range Creation

Page 11: U.S. Geological Survey

USGS National Range Locations

Sioux Falls, SD; Rolla, MO; and Pueblo, CO ranges completed
Airy, North Carolina and Rochester, NY ranges in process


Page 12: U.S. Geological Survey

Terrestrial LiDAR collected by USGS, Vivian Queija

· USGS EROS, by Vivian Queija, June 21-22, 2010
· Sioux Falls, SD, June 23, 2010
· Test data only; interested in the point cloud and a test range model; what do we need for targets and good performance testing?
· Note: images are quick looks only and are not fully processed


Page 13: U.S. Geological Survey

Front view of the water tower colored with RGB.

Same scan with points colored by intensity.

Same scan, rotated for back view.

Page 14: U.S. Geological Survey
Page 15: U.S. Geological Survey