
Page 1: GOES-R AWG Product Validation Tool Development


GOES-R AWG Product Validation Tool Development

AWG GRAFIIR Team, June 16, 2011

Presented by: Mat Gunshor of CIMSS

Ray Garcia, Allen Huang, Graeme Martin, Eva Schiffer, Hong Zhang and others

(CIMSS/UW-Madison)

Page 2

Products

Baseline Products GRAFIIR can currently run (L1 and L2+):

• Radiances
  – Validation of radiance data at the pixel level is a core function of GRAFIIR capability.
• Clouds
  – Clear Sky Mask; Cloud Optical Depth; Cloud Particle Size; Cloud Top Phase; Cloud Top Height; Cloud Top Pressure; Cloud Top Temperature
• Soundings
  – Legacy Vertical Moisture Profile; Legacy Vertical Temperature Profile; Derived Stability Indices (CAPE, LI, etc.); Total Precipitable Water
• Fire Hot Spot Characterization
• Imagery
• Derived Motion Winds
• Land Surface/Skin Temperature
• Hurricane Intensity
• Volcanic Ash Detection

Baseline algorithms are currently produced in GEOCAT.

Page 3

Products

• In the future, GRAFIIR expects to be able to run all of the AWG ABI baseline products by employing the AIT Framework as the processing end of the system.

• We expect that eventually all ABI Baseline and Option 2 products will be available to GRAFIIR via the AIT Framework.

Page 4

Products


Wavelength (micrometers): 0.47, 0.64, 0.865, 1.378, 1.61, 2.25, 3.90, 6.185, 6.95, 7.34, 8.5, 9.61, 10.35, 11.2, 12.3, 13.3
Channel ID: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16

[Table: matrix of baseline products vs. the ABI channels each product uses; the per-channel marks are not recoverable from this transcript. Products listed: Aerosol Detection; Suspended Matter/OD; Clear Sky Masks; Cloud & Moisture Imagery (all 16 channels); Cloud Optical Depth; Cloud Particle Size; Cloud Top Phase; Cloud Top Height; Cloud Top Pressure; Cloud Top Temperature; Hurricane Intensity; Rainfall Rate/QPE; Legacy Vertical Moisture Profile; Legacy Vertical Temp Profile; Derived Stability Indices; Total Precipitable Water; Downward Solar Insolation (Surface); Reflected Solar Insolation (TOA); Derived Motion Winds; Fire Hot Spot Characterization; Land Surface Temperature; Snow Cover; Sea Surface Temps.]

Bands may also be needed by “upstream” products, such as the cloud mask.

Page 5

Products


Wavelength (micrometers): 0.47, 0.64, 0.865, 1.378, 1.61, 2.25, 3.90, 6.185, 6.95, 7.34, 8.5, 9.61, 10.35, 11.2, 12.3, 13.3
Channel ID: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16

[Table: matrix of Option 2 products vs. the ABI channels each product uses; the per-channel marks are not recoverable from this transcript. Products listed: Cloud Layer/Heights; Cloud Ice Water Path; Cloud Liquid Water; Cloud Type; Convective Initiation; Turbulence; Low Cloud and Fog; Enhanced-V/Overshooting Top; Aircraft Icing Threat; SO2 Detection; Visibility (no direct use of ABI bands); Upward Longwave Radiation (TOA); Downward Longwave Radiation (SFC); Upward Longwave Radiation (SFC); Total Ozone; Aerosol Particle Size; Surface Emissivity; Surface Albedo; Vegetation Index; Vegetation Green Fraction; Flood/Standing Water; Rainfall Potential (no direct use of ABI bands); Rainfall Probability (no direct use of ABI bands); Snow Depth (no direct use of ABI bands); Sea & Lake Ice: Age (no direct use of ABI bands); Sea & Lake Ice: Concentration; Sea & Lake Ice: Motion; Ocean Currents; Ocean Currents: Offshore.]

Page 6

Validation Strategies

• GRAFIIR seeks to be able to validate all of the ABI L1 data and L2+ products in the context of analyzing ABI instrument waiver requests from the vendor.
  – By manipulating ABI proxy data to reflect instrument effects, GRAFIIR compares algorithm results “before” and “after” instrument effects are introduced to proxy data. The objective is to assess the effects of an instrument waiver on product performance for products that require the affected band(s).

• Current capability: compare product outputs, provide statistical analysis, and generate reports automatically through Glance.

• Future strategy: obtain the AIT Framework in order to gain the capability of generating any ABI L2 product.
  – The Framework must be maintained and kept in sync with the AIT version.
  – It will remain important in the future to maintain synergy between NESDIS scientists and the algorithm developers (at cooperative institutes, for example) by employing the same environment for development (the AIT Framework).

Page 7

Routine Validation Tools

• GRAFIIR has developed validation tools as part of its mission to assess instrument effects on ABI data and products.
  – The idea of “routine” perhaps does not fit.
  – Tool development has naturally grown to fit needs.
  – Tools in use now are more of the deep-dive variety.

• GRAFIIR vision: make the current tools more easily automatable.
  – An automatable version of GEOCAT/Framework paired with Glance and the collocation tools would give many product algorithm teams the ability to easily validate their products against a variety of “truth” datasets.
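The collocation idea mentioned above can be sketched in a few lines. This is an illustrative nearest-neighbor match in plain Python, not the actual GRAFIIR/Glance collocation code; the pixel and observation records (and the `ctt` field) are hypothetical.

```python
import math

def collocate(sat_pixels, truth_obs, max_km=10.0):
    """Match each 'truth' observation to the nearest satellite pixel
    within max_km (great-circle distance)."""
    def dist_km(lat1, lon1, lat2, lon2):
        # haversine formula, Earth radius ~6371 km
        p1, l1, p2, l2 = map(math.radians, (lat1, lon1, lat2, lon2))
        h = (math.sin((p2 - p1) / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin((l2 - l1) / 2) ** 2)
        return 2 * 6371.0 * math.asin(math.sqrt(h))

    matches = []
    for obs in truth_obs:
        nearest = min(sat_pixels,
                      key=lambda p: dist_km(p["lat"], p["lon"], obs["lat"], obs["lon"]))
        if dist_km(nearest["lat"], nearest["lon"], obs["lat"], obs["lon"]) <= max_km:
            matches.append((nearest, obs))
    return matches

# Hypothetical cloud-top-temperature pixels vs. ground "truth" points
pixels = [{"lat": 35.00, "lon": -97.00, "ctt": 215.0},
          {"lat": 36.00, "lon": -98.00, "ctt": 220.0}]
truth  = [{"lat": 35.02, "lon": -97.01, "ctt": 214.0},   # ~2 km from the first pixel
          {"lat": 40.00, "lon": -90.00, "ctt": 250.0}]   # no pixel nearby
pairs = collocate(pixels, truth)   # only the first observation matches
```

A production tool would use a spatial index rather than this brute-force scan, but the matching logic is the same.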


Page 8

• The validation tools used for GRAFIIR vary with the nature of the instrument waiver rather than with product type:
  1. ABI instrument effects that can be applied to simulated ABI proxy data from the WRF model.
  2. ABI instrument effects that cannot be (easily/quickly) applied to simulated ABI proxy data from the WRF model.

• The AWG GRAFIIR Team has responded to ABI waivers for both situations and has developed tools accordingly.

• The best validation tool GRAFIIR has is Glance.
  – This is the most easily applicable, cross-cutting tool we have available to other AWG teams and the AIT.
  – Glance can be used with both L1 data and L2+ products.

“Deep-Dive” Validation Tools

Page 9

“Deep-Dive” Validation Tools

Glance could make this easier!

Page 10

“Deep-Dive” Validation Tools

Can you see a difference?

Page 11

“Deep-Dive” Validation Tools

Page 12

• The validation tools used for GRAFIIR vary with the nature of the instrument waiver rather than with product type:
  1. ABI instrument effects that can be applied to simulated ABI proxy data from the WRF model.
  2. ABI instrument effects that cannot be (easily/quickly) applied to simulated ABI proxy data from the WRF model.

“Deep-Dive” Validation Tools

Page 13

1. ABI instrument effects that can be applied to simulated ABI proxy data from the WRF model.
  – Example: Striping in one or more spectral bands.
  – Example: Increased noise in one or more spectral bands.
  – Example: Navigation errors in one or more spectral bands.
    • Note: ABI specifications exist for all of these parameters, and a waiver is only required when the expected instrument performance will be worse than the specs.

• When the effect is relatively easy to simulate in the existing simulated ABI proxy data sets, the process is fairly straightforward.

“Deep-Dive” Validation Tools

Page 14

1. ABI instrument effects that can be applied to simulated ABI proxy data from the WRF model.

• There are 4 primary steps to analyzing a waiver such as this:
  1. Simulate the instrument effect in the proxy data. (MATLAB)
     • This is the least straightforward part of the process, depending on what the waiver is for and how it is deemed to affect the radiance data.
  2. Produce products that rely on the affected spectral bands using data before and after the instrument effect was introduced. (GEOCAT)
     • Could be done in the Framework; this is a straightforward step for an analyst familiar with the software.
  3. Compare “before” and “after” products; analyze differences. (Glance)
     • Glance can read multiple file types and provide a variety of types of analysis.
  4. Obtain expert analysis of the results.
     • Typically we get input from the algorithm scientists and generate a PowerPoint presentation that also serves as a report.

“Deep-Dive” Validation Tools

Page 15

• The following slides are an example of this first type of validation analysis.
  – Existing proxy data are altered to reflect the effects of some out-of-spec component of the instrument. In this case, we are pretending that we have one line of the detector array in one spectral band that is noisier than it should be.

• First, the instrument effect is simulated in proxy data. The validation shown here is visual, but we do statistical validation in this step as well; for instance, the random noise generated is tested (is it normally distributed noise, with a predictable standard deviation?). This is done in MATLAB.

• Second, products are generated that use this spectral band. This step is done in GEOCAT or could be done in the Framework.

• Third, we analyze the difference in the products generated using “control” and “waiver” data. We show only cloud top height here. This step is done using Glance.
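The first step above, injecting noise into one detector line and then sanity-checking its statistics, can be sketched as follows. The deck uses MATLAB; this is an illustrative plain-Python equivalent with an idealized, made-up scene.

```python
import random
import statistics

def add_line_noise(image, row, sigma, seed=0):
    """Return a copy of image (a list of scan rows of brightness temps)
    with zero-mean Gaussian noise of std 'sigma' added to one row,
    mimicking a single out-of-spec detector line."""
    rng = random.Random(seed)
    noisy = [r[:] for r in image]
    noisy[row] = [v + rng.gauss(0.0, sigma) for v in image[row]]
    return noisy

# Simulate the effect, then verify the injected noise as described above:
image = [[240.0] * 5000 for _ in range(4)]   # idealized 4-line scene, 240 K everywhere
noisy = add_line_noise(image, row=2, sigma=0.5)
residual = [n - o for n, o in zip(noisy[2], image[2])]
# The residual should be approximately zero-mean with std near 0.5
print(round(statistics.mean(residual), 3), round(statistics.pstdev(residual), 3))
```

The statistical check (mean near zero, standard deviation near the requested sigma) is the same idea as the "is it normally distributed, predictable standard deviation" test the slide describes.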

“Deep-Dive” Validation Tools

Page 16

“Deep-Dive” Validation Tools

The Control Case: Magnified by 3x and focused on the Texas/Oklahoma border area convection where one of the out-of-spec lines passes through.

Page 17

“Deep-Dive” Validation Tools

The Waiver Case: Magnified by 3x and focused on the Texas/Oklahoma border area convection where one of the out-of-spec lines passes through.

Page 18

“Deep-Dive” Validation Tools

The Difference: Magnified by 3x, the out-of-spec line is evident in the brightness temperature difference image.

Page 19

• http://cimss.ssec.wisc.edu/goes_r/grafiir/PC-1134/Clouds_epsilon0/
• HTML report generated by Glance with statistics and images.
• The “Zero Tolerance” analysis shows all the absolute changes introduced by the out-of-spec line noise.
• Follow the link for the statistical report (click on a product variable name to see the reports for each one):
  – Cloud Top Height
  – Cloud Top Pressure
  – Cloud Top Temperature
  – Cloud Mask (unaffected)
  – Cloud Phase (unaffected)
  – Cloud Type (unaffected)

“Deep-Dive” Validation Tools

Page 20

“Deep-Dive” Validation Tools

Difference Image: Most of the image is a difference of 0. This is Cloud Top Height, but the image looks similar for Cloud Top Pressure and Cloud Top Temperature; from Glance.

Page 21

“Deep-Dive” Validation Tools

“Trouble Points”: Trouble points are marked for any pixel in the two output files whose difference exceeds epsilon (which is 500m in this case). Cloud Top Height shown, from Glance.
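The trouble-point bookkeeping described in this caption can be illustrated with a small sketch. This mirrors the idea, not Glance's actual implementation, and the height values are made up.

```python
def trouble_points(before, after, epsilon):
    """Flag pixels whose before/after difference exceeds epsilon and
    summarize, in the spirit of Glance's trouble-point statistics."""
    diffs = [abs(a - b) for a, b in zip(after, before)]
    n_trouble = sum(d > epsilon for d in diffs)
    return {
        "trouble_points": n_trouble,
        "trouble_fraction": n_trouble / len(diffs),
        "max_diff": max(diffs),
        "mean_diff": sum(diffs) / len(diffs),
    }

# Hypothetical cloud-top heights (m) before/after a simulated effect;
# epsilon = 500 m, as in the slide
before = [9000.0, 8500.0, 7200.0, 9100.0]
after  = [9020.0, 8490.0, 7950.0, 9100.0]
report = trouble_points(before, after, epsilon=500.0)
```

Here only the third pixel (a 750 m change) exceeds epsilon, so the trouble-point fraction is 1/4.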

Page 22

“Deep-Dive” Validation Tools

The statistics are from Glance:

Product             Trouble Points   Trouble Point Fraction   Max Difference   Mean Difference
Height (500 m)            34               1.048e-05               6,456           0.2491
Pressure (50 hPa)         14               4.317e-06               280.3           0.01163
Temperature (3 K)         50               1.542e-05               47.47           0.002161
Cloud Mask                 0               0                       0               0
Cloud Phase                0               0                       0               0
Cloud Type                 0               0                       0               0

Page 23

2. ABI instrument effects that cannot be (easily/quickly) applied to simulated ABI proxy data from the WRF model.
  – Example: Out-of-spec spectral response functions (SRFs).

• If the effect cannot be easily or accurately replicated in the simulated data, we cannot generate products in GEOCAT and compare the outputs in Glance.
  – SRF changes are generally too time-consuming to get into the proxy data because they involve altering the forward model. Proxy data generated from forward-model calculations using forecast-model atmospheric profile information is expensive to produce, and we typically have only 1–2 weeks to respond to a waiver.

“Deep-Dive” Validation Tools

Page 24

2. ABI instrument effects that cannot be (easily/quickly) applied to simulated ABI proxy data from the WRF model.
  – Example: Out-of-spec spectral response functions (SRFs).

• There are 3 primary steps to analyzing a waiver such as this:
  1. Simulate the instrument effect in proxy data. (MATLAB)
     • For example, convolve high-spectral-resolution data with before/after SRFs.
  2. Since products cannot be generated, use alternatives. (MATLAB)
     • In the case of SRFs, compare the brightness temperatures of convolved high-spectral-resolution data (e.g. IASI) and compare differences to the spec noise to get an understanding of their significance.
     • We have to be sure we are still measuring key components (e.g. SO2).
     • The products we can run in GEOCAT have all had analysis done on them previously with “pure” proxy data compared to data with spec noise added.
  3. Obtain expert analysis of the results.
     • Typically we get input from the algorithm scientists and generate a PowerPoint presentation that also serves as a report.
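The SRF-convolution comparison in steps 1–2 above can be sketched in a few lines. The spectrum and SRF shapes below are invented for illustration (real IASI spectra and the vendor SRFs are far denser); only the NEdN value 0.1303 is taken from the deck.

```python
def band_radiance(spectrum, srf):
    """SRF-weighted average of a high-spectral-resolution radiance
    spectrum (e.g. IASI), giving the broadband ABI-like radiance."""
    return sum(r * s for r, s in zip(spectrum, srf)) / sum(srf)

# Invented radiance samples near 8.5 um and two hypothetical SRF shapes
spectrum         = [95.0, 96.0, 98.0, 97.0, 95.5]
srf_compliant    = [0.1, 0.8, 1.0, 0.8, 0.1]
srf_noncompliant = [0.2, 0.8, 1.0, 0.7, 0.1]

rad_c  = band_radiance(spectrum, srf_compliant)
rad_nc = band_radiance(spectrum, srf_noncompliant)

nedn  = 0.1303                        # spec noise (NEdN) for this band, from the deck
ratio = abs(rad_c - rad_nc) / nedn    # < 1: the SRF difference is below spec noise
```

The same ratio-to-NEdN comparison is what slide 27 plots across a whole scene.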

“Deep-Dive” Validation Tools

Page 25

“Deep-Dive” Validation Tools

• The following slides are an example of this second type of validation analysis.
  – The 8.5 µm band SRF may be slightly out of spec.
  – Will we still be able to see SO2?
  – How will the radiances be affected?

• First, SRFs must be obtained and altered.
  – We were given SRFs which were “compliant” and “non-compliant” with the specs.
  – These are not shown here to avoid ITAR designation.

• Second, radiances and brightness temperatures are generated from both calculated and measured high-spectral-resolution data.
  – We had some calculated spectra available with various amounts of SO2.

• Third, we analyze the differences in the radiances of the compliant and non-compliant SRFs.
  – These are compared to the spec noise for perspective.

Page 26

“Deep-Dive” Validation Tools

Non-compliant 8.5 µm SRF convolved with IASI

Compliant 8.5 µm SRF convolved with IASI

Brightness temperature difference image

Page 27

“Deep-Dive” Validation Tools

Ratio of the radiance difference (spec-compliant minus non-spec-compliant) to the spec noise (NEdN) in this band (0.1303). When this ratio is less than 1, the difference is less than the spec NEdN.

Spec-Compliant – Non-Spec-Compliant 8.5 µm Radiance Difference to NEdN Ratio

Page 28

“Deep-Dive” Validation Tools

• Using Glance to compare L2+ product output to a “truth” dataset that was generated as product output.
  – Most of GRAFIIR’s waiver tasks are to measure the effects of a change on product output.
  – But many algorithm teams need to validate their product against another type of measured data to quantify product performance.

• Glance can be used to do more validation.
  – As teams learn what their needs are and develop capabilities, we hope to be able to merge these ideas.
  – Ideally, scientists should be doing more analysis and not have to worry about the traditionally difficult tasks of collocating data, processing it, etc.

Page 29

“Deep-Dive” Validation Tools

• Example: Using Glance to compare WRF-model-output cloud top temperature to the AWG product algorithm cloud top temperature generated with the simulated proxy ABI data (itself generated from the WRF model output).
  – The WRF model cloud top information is treated as “truth.”
  – We expect there to be differences because the model reports a cloud top even when it is too optically thin to be detected by ABI.
    • So WRF model cloud tops should be higher and colder than those in the proxy data.
  – Note: One file is an HDF file output from GEOCAT and the other is a netCDF file generated from WRF model output.

Page 30

“Deep-Dive” Validation Tools

• Cloud Top Temperature from ABI cloud algorithm (from Glance)

Page 31

“Deep-Dive” Validation Tools

• Cloud Top Temperature from WRF (from Glance)

Page 32

“Deep-Dive” Validation Tools

• Difference image, WRF output – Proxy L2 (from Glance)

Page 33

“Deep-Dive” Validation Tools

• Statistics from Glance (numerical comparison statistics):
  – correlation: 0.8433
  – diff_outside_epsilon_count: 965566
  – diff_outside_epsilon_fraction: 1
  – max_diff: 98.64
  – mean_diff: 12.69
  – median_diff: 9.739
  – mismatch_points_count: 1359122
  – mismatch_points_fraction: 0.4191
  – perfect_match_count*: 0
  – perfect_match_fraction*: 0
  – r-squared correlation*: 0.7112
  – rms_diff*: 18.97
  – std_diff*: 14.11

Page 34

• GRAVA (GOES-R Advanced Validation Automation)
  – Greater automation of validation tasks.
  – An extension of GRAFIIR to optimize field campaign work for GOES-R.
  – Beginning with a planned analysis of field campaigns to assess how to optimize their utilization for GOES-R Cal/Val.

• AIT Framework at CIMSS to expand our access to more products.
  – Converge on file types for inputs and outputs (avoid reliance on McIDAS AREA files for input).
  – Converge on calibration methods (adopt the Imagery Team file format).
  – Converge on navigation (Fixed Grid Format).

• Merging the collocation capabilities with Glance should make validation easier for a host of algorithm teams.

• The GRAFIIR Team’s use of Glance thus far has been fairly limited in that comparisons are normally done as a before/after look at instrument effects on product performance. But Glance can be used to compare to a “truth” dataset that is not a prior run of the product.

Ideas for the Further Enhancement and Utility of Validation Tools

Page 35

Summary

• Glance
  – The GRAFIIR team has helped to develop a validation tool which can be used both for routine validation and as a deep-dive tool.
  – In analyzing multiple ABI waivers to date, the GRAFIIR team has already been doing both L1 radiance and L2 product validation using Glance.
  – Glance can meet the needs of many product algorithm teams.

• GRAVA
  – A future extension of GRAFIIR for greater automation.
  – Coordination of collocation, calibration/validation, field campaign and other data, L1 and L2+ ABI data and products, visualization, and Glance.

• GOES-R AWG
  – The GRAFIIR toolset is not a replacement for science expertise.
  – Scientists should spend less time worrying about:
    • File formats
    • Collocation
  – The GRAFIIR team can help!

Page 36

Summary

More Information

• How to Install Glance
• Glance Documentation
• Eva Schiffer <[email protected]>
• Ray Garcia <[email protected]>
• Mat Gunshor <[email protected]>