TRANSCRIPT
7/12/2010
Improving high impact weather forecasts by adaptive observing and data processing
methods
Yucheng Song1 and Zoltan Toth2
DAOS, Montreal, July 8-9, 2010
1. NOAA/NWS/NCEP/EMC
2. NOAA/OAR/ESRL/GSD/FAB
Acknowledgments
• Craig Bishop – NRL
• Rolf Langland – NRL
• Ron Gelaro – NASA
• Istvan Szunyogh – TAMU
• Sharan Majumdar – Univ. of Miami
• Lacey Holland – 3Tier
• NWS field offices (WFOs), HPC/NCEP, and SDMs
• NOAA G-IV and USAF C-130 flight crews
• Jack Parrish, Paul Flaherty (NOAA AOC)
• Jon Talbot (USAF)
• Dave Novak (HPC)
• CARCAH (John Pavone)
• Mark Moran (NOAA)
• Jack Woollen – EMC
• Russ Treadon – EMC
• Mark Iredell – EMC
• + others who have contributed!
Historical Overview
Subjective targeting of tropical cyclones
• 1982: NOAA HRD began research flights for hurricane track forecasting
• Long lifecycle of TCs, no significant downstream development
• Burpee et al. 1996: 25% improvement in hurricane track forecasts
• NOAA procurement of the Gulfstream IV (G-IV); CALJET and PACJET
Development of objective targeting methods for extratropical applications
• Subjective methods unreliable due to Rossby wave propagation
• Snyder 1996: targeted observation ideas were discussed at a 1995 workshop
• Different targeting techniques developed:
  – Singular vectors (e.g., Gelaro et al. 1999)
  – Adjoint sensitivity calculation (e.g., Langland and Rohaly 1996)
  – Linear inverse technique (Pu et al. 1997)
  – ET (Bishop and Toth 1996; 1999)
  – ETKF (Bishop et al. 2001; Majumdar et al. 2002)
Field testing of methods
• FASTEX (Joly et al. 1999; Toth et al. 1998; Szunyogh et al. 1999; Pu and Kalnay 1999): 11 countries, 1996-97 winter (Jan 6 – Feb 28)
• NORPEX (Szunyogh et al. 1999b; Langland et al. 1999; Liu and Zhou 2001): Jan 14 – Feb 27, 1998, multi-agency, northeastern Pacific
• CALJET and PACJET (Toth et al. 2000): land-falling jet experiments (0-24 hr)
OPERATIONAL IMPLEMENTATION
• Pre-implementation testing
  – Experimental WSR-2000
  – 80% of cases improved (Toth et al. 2000)
• Operational implementation at NWS
  – WSR 2001: all aspects run operationally
  – Continual improvements to software based on user input (ensemble membership increased from 35 to 177)
• Significance and feedback
  – News media reports each year
  – Excellent educational program for school teachers and students
  – Alerts the public for severe winter snowstorm awareness
  – NOAA G-IV available hours + C-130 free flight hours
Operational Implementation
Pre-implementation testing
• The success of the early field experiments led to the establishment of the WSR in 1999 (Toth et al. 1999), to reduce forecast errors for significant winter weather events over CONUS in the 24-96 hour lead-time range
• Data collected by G-IV and C-130s (operated by NOAA AOC and USAF)
• Verification results (Szunyogh et al. 2000, 2001) indicate that the majority of the targeted forecasts are significantly improved
• Experimental WSR-2000: 80% of cases improved (Toth et al. 2000)
Operational implementation at NWS
• In 2001, WSR became a regular program of NWS (Toth et al. 2001)
• Langland 2005, "Issues in targeted observing": a very good review of field programs since 1997. "The primary scientific challenges for targeting include the refinement of objective methods that can identify optimal times and locations for targeted observations" – Rolf Langland
References
– Song et al. 2010
– Hodyss et al. 2007
– Sellwood et al. 2008
– Majumdar, S. J., K. J. Sellwood, D. Hodyss, Z. Toth, and Y. Song, 2010
– Song et al. 2005-2010: several presentations at space-based lidar group meetings and conferences
– Bergot, T., G. Hello, and A. Joly, 1999: Adaptive observations: a feasibility study. Mon. Wea. Rev., 127, 743-765.
– Bishop, C. H., B. J. Etherton, and S. J. Majumdar, 2000: Adaptive sampling with the Ensemble Transform Kalman Filter. Part I: Theoretical aspects. Mon. Wea. Rev., in print.
– Bishop, C., and Z. Toth, 1996: Using ensembles to identify observations likely to improve forecasts. Preprints of the 11th AMS Conference on Numerical Weather Prediction, 19-23 August 1996, Norfolk, Virginia, 72-74.
– Bishop, C. H., and Z. Toth, 1999: Ensemble transformation and adaptive observations. J. Atmos. Sci., 56, 1748-1765.
– Bowie, Edward H., 1922: Formation and movement of West Indian hurricanes. Mon. Wea. Rev., 50, 173-190.
– Buizza, R., and A. Montani, 1999: Targeting observations using singular vectors. J. Atmos. Sci., 56, 2965-2985.
– Buizza, R., T. Petroliagis, T. N. Palmer, J. Barkmeijer, M. Hamrud, A. Hollingsworth, A. Simmons, and N. Wedi, 1998: Impact of model resolution and ensemble size on the performance of an ensemble prediction system. Q. J. R. Meteorol. Soc., 124, 1935-1960.
– Burpee, R. W., J. L. Franklin, S. J. Lord, R. E. Tuleya, and S. D. Aberson, 1996: The impact of Omega dropwindsondes on operational hurricane track forecast models. Bull. Amer. Meteor. Soc., 77, 925-933.
– Derber, J., H.-L. Pan, J. Alpert, C. Caplan, G. White, M. Iredell, Y.-T. Hou, K. Campana, and S. Moorthi, 1998: Changes to the 1998 NCEP Operational MRF model Analysis-Forecast system. NOAA/NWS Tech. Procedure Bull. 449, 16 pp. [Available from Office of Meteorology, National Weather Service, 1325 East-West Highway, Silver Spring, MD 20910.]
– Gelaro, R., R. H. Langland, G. D. Rohaly, and T. E. Rosmond, 1999: An assessment of the singular-vector approach to targeted observations using the FASTEX dataset. Quart. J. Roy. Meteor. Soc., 125, 3299-3328.
– Gregg, W. R., 1920: Aerological observations in the West Indies. Mon. Wea. Rev., 48, 264.
– Franklin, J. L., S. E. Feuer, J. Kaplan, and S. D. Aberson, 1996: Tropical cyclone motion and surrounding flow relationships: Searching for beta gyres in Omega dropwindsonde datasets. Mon. Wea. Rev., 124, 64-84.
– Joly, A., K. A. Browning, P. Bessemoulin, J.-P. Cammas, G. Caniaux, J.-P. Chalon, S. A. Clough, R. Dirks, K. A. Emanuel, L. Eymard, F. Lalaurette, R. Gall, T. D. Hewson, P. H. Hildebrand, D. Jorgensen, R. H. Langland, Y. Lemaitre, P. Mascart, J. A. Moore, P. O. G. Persson, F. Roux, M. A. Shapiro, C. Snyder, Z. Toth, and R. M. Wakimoto, 1999: Overview of the field phase of the Fronts and Atlantic Storm-Track Experiment (FASTEX) project. QJRMS, 125, in print.
Winter Storm Reconnaissance Program
Objective: Improve forecasts of significant winter weather events over CONUS in the 24-96 hour lead-time range through targeted observations in the data-sparse northeast Pacific Ocean
Operational since January 2001
Adaptive approach to the collection of observational data:
1) Only prior to significant winter weather events of interest
2) Only in areas that influence high impact event forecasts
Results:
• 70+% of targeted numerical weather predictions improve
• 10-20% error reduction for high impact event forecasts
• 12-hour gain in predicting high impact events – earlier warnings possible
• WSR data has 2.7 times more impact per observation than RAOB observations
2008: P-3 out of Portland, OR
WSR LOGISTICS
• Identify threatening events (at 3-6 day lead time)
  – Time: by 04 UTC
  – Who: operational NWS field and NCEP forecasters
  – Tools: ensemble-based probabilistic forecasts
  – Outcome: time/location/nature of event
• Determine sensitive areas
  – Time: by 06 UTC
  – Who: Senior Duty Meteorologist (SDM) at NCEP
  – Tools: ETKF, using NCEP, CMC, and ECMWF ensembles (177 members)
  – Outcome: objective guidance (expected impact of data on forecast, best track)
• Decision about adaptive data collection
  – Time: by 08 UTC
  – Who: SDM
  – Tools: objective ETKF guidance
  – Outcome: email about flight plans, or lack thereof
WSR LOGISTICS - 2
• Collect adaptive observations
  – Time: around 00 UTC the day after the request is made
  – Who: NOAA Aircraft Operations Center (AOC) & USAF Reserve
  – Tools: manned aircraft (G-IV, C-130s)
  – Outcome: quality-controlled dropsonde data onto the GTS
• Assimilate adaptive observations along with observations from routine networks
  – Time: by 0230 UTC
  – Who: NCEP Central Operations
  – Tools: operational GSI & GFS systems
  – Outcome: global analysis and forecast reflecting the impact of the adaptive observations
• Evaluate impact of adaptive observations
  – Time: near real time or off-line
  – Who: EMC personnel
  – Tools: reduced-resolution version of the operational GSI/GFS systems
  – Outcome: verification statistics
The WSR website has more information:
http://www.emc.ncep.noaa.gov/gmb/targobs/index_files/wsr.htm
EVALUATION OF RESULTS
• Case studies
  – Subjective comparison of analyses/forecasts with/without adaptively collected data
• Can we predict the forecast impact of adaptive data?
  – Compare the spatiotemporal evolution of predicted and observed data impact
  – Three experiments used
• How much did forecasts improve?
  – Compare errors in forecasts from analyses with/without targeted data
  – Use both observations and analysis fields as proxies for truth
• Do the benefits from targeted obs exceed their costs?
  – Beyond the scope of this study
  – Must be evaluated for all components of the observing network
  – Importance of OSSE studies and socio-economic considerations
Applying the ETKF targeting technique
[Figure: storm, target area, and verification area, with dropsondes to be made by the G-IV]
The ETKF identified the best targeting track
Expected error reduction propagation based on the best flight track
Evaluated by a 3-level energy norm of u, v, T
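The slide does not spell out the norm; a common form of the total (dry) energy norm over wind and temperature perturbations, summed over the three levels, is E = ½ Σ (u′² + v′² + (cp/Tr) T′²). A minimal sketch, where the cp and Tr values are standard textbook choices (assumptions, not taken from the talk) and `energy_norm` is a hypothetical helper name:

```python
import numpy as np

CP = 1004.0  # specific heat of dry air at constant pressure [J kg-1 K-1] (assumed)
TR = 270.0   # reference temperature [K] (assumed)

def energy_norm(du, dv, dt, weights=None):
    """Total (dry) energy norm of perturbations du, dv [m/s] and dt [K].

    Arrays are shaped (nlev, nlat, nlon); `weights` are optional
    per-gridpoint area weights (e.g. cos(latitude))."""
    e = 0.5 * (du**2 + dv**2 + (CP / TR) * dt**2)  # J kg-1 at each point
    if weights is not None:
        e = e * weights
    return float(e.sum())
```

Summing the pointwise energy over a limited region (e.g. the verification area) gives a single scalar with which predicted and realized error reduction can be compared.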
ETKF Forecast vs. Actual Data Impact (Feb 18 – Feb 22, 2007)
[Figure: predicted vs. observed data impact (forecast improvement/degradation) as a function of forecast time]
Can the ETKF predict signal variance?
[Figure: ETKF signal vs. GFS squared signal for the 5 best and 5 worst cases – 120 hrs]
From Sellwood et al. 2008, QJRMS
Valentine's Day Storm
• Weather event with a large societal impact
• Each GFS run verified against its own analysis – 60 hr forecast
• Impact on surface pressure verification
• RMS error improvement: 19.7% (2.48 mb vs. 2.97 mb)
• Targeted in the high impact weather area marked by the circle
[Figure: surface pressure from analysis (hPa; solid contours), forecast improvement (hPa; red), forecast degradation (hPa; blue)]
WSR evaluation 2004-2008
From 2004-2007 (129 cases) WSR:
• 71.3% improved while 27.9% degraded
• On average there is a 10-20% forecast error reduction (in terms of fit-to-obs RMSE) within the significant high impact event areas
• Better forecasts translate into a 12-hour gain in predictability (a 48-hr forecast with targeted data is as skillful as a 36-hr forecast without)
• 2005 and 2007 were the best years
From the 2008 WSR + HMT collaboration:
• G-IV not available
• Shorter and lower-altitude flights with the P-3, targeting mesoscale, shorter-range forecasts
• Sampling the atmospheric river effect had some moderate positive impact
Forecast Verification for Wind
• 10-20% RMS error reduction in winds
[Figure: RMS error reduction vs. forecast lead time]
Forecast Verification for Temperature
• 10-20% RMS error reduction in temperature
• A 60-hr forecast is equivalent to a 48-hr forecast
[Figure: RMS error reduction vs. forecast lead time]
WSR Summary Statistics (4 years)

Variable         | # cases improved | # cases neutral | # cases degraded
Surface pressure | 21+20+13+25 = 79 | 0+1+0+0 = 1     | 14+9+14+12 = 49
Temperature      | 24+22+17+24 = 87 | 1+1+0+0 = 2     | 10+7+10+13 = 40
Vector wind      | 23+19+21+27 = 90 | 1+0+0+0 = 1     | 11+11+6+10 = 38
Humidity         | 22+19+13+24 = 78 | 0+0+0+0 = 0     | 13+11+14+13 = 51

Overall: 25+22+19+26 = 92 positive cases, 0+1+0+0 = 1 neutral case, 10+7+8+11 = 36 negative cases
71.3% improved, 27.9% degraded
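The per-year "overall" counts can be tallied to reproduce the quoted percentages (a quick check, using only the numbers on this slide):

```python
# Per-year overall case counts from the 4-year WSR summary (2004-2007)
improved = [25, 22, 19, 26]   # overall positive cases
neutral  = [0, 1, 0, 0]       # overall neutral cases
degraded = [10, 7, 8, 11]     # overall negative cases

total = sum(improved) + sum(neutral) + sum(degraded)   # 92 + 1 + 36 = 129
pct_improved = 100.0 * sum(improved) / total           # ~71.3%
pct_degraded = 100.0 * sum(degraded) / total           # ~27.9%
```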
Statistical Significance
Suppose the chance of getting an improvement from extra adaptive observations is 0.5 – that is our null hypothesis.
P(36 out of 128) = 2.449e-07
This means that the chance of obtaining these results purely by chance is very low.
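The quoted value appears to be the exact binomial point probability of seeing exactly 36 degraded cases out of the 128 non-neutral cases (92 improved + 36 degraded) under the p = 0.5 null hypothesis. It can be reproduced with exact integer binomial coefficients (a sketch; `binom_point_prob` is a hypothetical helper name):

```python
from math import comb

def binom_point_prob(k: int, n: int) -> float:
    """P(X = k) for X ~ Binomial(n, p=0.5), computed exactly."""
    return comb(n, k) / 2**n

p = binom_point_prob(36, 128)   # ~2.449e-07
```

A one-sided tail probability P(X <= 36) would be the stricter significance test, but it is of the same tiny order, so the conclusion is unchanged.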
Vector Wind (m/s) – WSR 2007
[Figure: RMS error with dropsondes vs. RMS error without dropsondes (scatter, 5-20 m/s axes). Forecast fits to RAOB (RMSE) verification, summed vertically (1000-100 mb) and horizontally over a 1000 km radius (CONUS), by NCEP/GFS]

Typical adaptive observation data impact
[Figure: 24-h moist total energy error norm (J kg-1) error reduction by observation type (Satwind, AMSU-A, SSMI/PRH, Raob, Dropsonde, Aircraft, Land Sfc, Scatwind, Windsat, Modis, Ship Sfc, SSMI/Wnd), shown as summed impact and per-ob impact (x 1.0e5); 1-31 Jan 2007 (00 UTC analyses)]
Rolf Langland
Precipitation Verification (2005)
• Precipitation verification is difficult due to the lack of station observation data in some regions.

ETS                         | 10 mm | 5 mm
OPR                         | 20.44 | 16.50
CTL                         | 18.56 | 16.35
Positive vs. negative cases | 3:1   | 4:1
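For reference, the ETS (equitable threat score) above is computed from a 2x2 contingency table of forecast/observed exceedances of each threshold; the table's values are apparently ETS x 100. A minimal sketch (the counts in the usage line are illustrative, not from the 2005 verification):

```python
def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
    """ETS = (hits - h_rand) / (hits + misses + false_alarms - h_rand),
    where h_rand = (hits + misses) * (hits + false_alarms) / total
    is the number of hits expected by random chance."""
    total = hits + misses + false_alarms + correct_negatives
    h_rand = (hits + misses) * (hits + false_alarms) / total
    return (hits - h_rand) / (hits + misses + false_alarms - h_rand)

# Illustrative counts: 50 hits, 20 misses, 30 false alarms, 900 correct negatives
ets = equitable_threat_score(50, 20, 30, 900)   # ~0.47
```

Unlike the plain threat score, ETS discounts hits expected by chance, so it rewards skill rather than frequent-event base rates.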
OPEN QUESTIONS
• Does operational targeting have economic benefits?
  – Cost-benefit analysis needs to be done for different regions – SERA research
• Are there differences between the Pacific (NA) & Atlantic (Europe)?
• Can similar or better results be achieved with cheaper observing systems?
  – Observing systems of opportunity: targeted processing of satellite data, AMDAR
  – UAVs?
• Sensitivity to data assimilation techniques
  – Advanced DA methods extract more info from any data
  – Better analysis without targeted data
  – Larger impact from targeted data (relative to the improved analysis with standard data)?
• What are the limitations of current techniques?
  – What can be said beyond the linear regime? Need a larger ensemble for that?
  – Can we quantify expected forecast improvement (not only impact)? Distinction between predicting impact vs. predicting positive impact
  – Effect of sub-grid scales ignored so far: does the ensemble display more orderly dynamics than reality? Overly confident signal propagation predictions?
MAIN THEME OF WINTER T-PARC
Study the lifecycle of perturbations as they originate from the tropics, Asia, and/or the polar front, travel through the Pacific waveguide, and affect high impact wintertime weather events over North America and the Arctic.
Tropical flare-ups in the western Pacific (IR) merge with waves on the westerly flow to influence deep cyclogenesis in the northeast Pacific.
[Figure: verification region, 12 UTC 14 Oct; sensitive areas 1 and 2, 00 UTC 11 Oct – captured by the Ensemble Transform KF targeting method]
ENHANCED OBSERVING PLATFORMS
Led by NCEP; multi-agency coordination
Extensive observational platforms during the T-PARC winter phase allow us to track the potential storms and take additional observations as the perturbations propagate downstream into the Arctic and North America.
[Figure: map of observing platforms and lead times relative to the verification region (VR) – Russian RAWIN (day -4 to -6 and day -3 to -5), G-IV, C-130 (day -1 to -3), E-AMDAR, with coverage from Asia through Alaska to CONUS]
Winter T-PARC 2009 – A THORPEX field campaign
Winter T-PARC platform statistics
• NOAA G-IV: 24 successful missions, 201 hrs flown with 456 dropsondes, out of Japan during Jan 11 to Feb 26, 2009
• USAF C-130s: 14 successful missions, 142.8 hrs flown with 212 dropsondes, out of Alaska during Jan 20 to Feb 13, 2009
• E-AMDAR from Lufthansa airline (descents and ascents: boxed area): total 802+1103 = 1905 profiles and 7040+10600 = 17640 en-route obs, from Jan 11 – Feb 28, 2009
• Enhanced Russian RAOB coverage: 602 extra radiosondes released from 37 selected stations for 33 cases, from Jan 12 to Feb 28, 2009
The main website was the central place for forecast products and discussions. A plethora of products was used to provide sound guidance to the adaptive platforms.
A significant amount of coordination among the different participants was required:
1. Track redesign due to length changes or weather pattern changes
2. Different time zones
3. Air traffic control
4. "Lost in translation"
March 1, 2009 CA Storm
• Weather event with a large societal impact
• Each GFS run verified against its own analysis – 60 hr forecast
• Impact on surface pressure verification
• RMS error improvement: 35.2% (7.07 mb vs. 9.56 mb)
• Targeted in the high impact weather area marked by the circle
[Figure: surface pressure from analysis (hPa; solid contours), forecast improvement (hPa; red), forecast degradation (hPa; blue)]
Large area of sensitivity – which may not be covered well by the limited dropsondes from manned aircraft. Satellite data processing?

Example: Mar 2, 2009
Atlantic coast braces for the biggest snowstorm of the season on March 2, 2009; heavy rainfall near California on Feb 27.
TRENTON, N.J. (March 2, 2009) – "A ferocious storm packing freezing rain, heavy snow and furious wind gusts paralyzed most of the East Coast on Monday, sending dozens of cars careening into ditches, grounding hundreds of flights and closing school for millions of kids." – From MSNBC
[Figure: analysis, forecast with targeted data, and forecast without targeted data]
March 1, 2009 CA Storm Sequence
[Figure: area-averaged fit of surface pressure (difference of forecast minus analysis); red: forecast improvement, blue: degradation]
Overwhelming improvement!
Typical adaptive observation data impact – Winter T-PARC
[Figure: targeted dropsonde impact on 24-h forecast error in NOGAPS/NAVDAS; error reduction is NEGATIVE, units are J/kg. Forecast fits to RAOB (RMSE) verification, summed vertically (1000-100 mb) and horizontally over a 1000 km radius (CONUS), by NCEP/GFS]
Rolf Langland
T-PARC Summary Statistics

Variable         | # cases improved | # cases neutral | # cases degraded
Surface pressure | 37               | 0               | 15
Temperature      | 35               | 0               | 17
Vector wind      | 36               | 0               | 16
Humidity         | 28               | 0               | 24

Overall: 39 positive cases, 13 negative cases
75% improved, 25% degraded
OPEN QUESTIONS
• NCEP targeted data impact evaluation
  – Forecast error reduced in 75% of cases
• Is this good or not?
• Context – comments/thought experiment by Olivier Talagrand:
  – Assume a second, independent set of observations is available, with identical characteristics to the existing observing network
  – Assume an independent data assimilation / forecast cycle is conducted using the 2nd set of data
  – Assume the analyses using the 1st & 2nd sets of data are properly combined to improve the final analysis / forecast
  – Question: In what percentage of cases does the quality of the combined analysis using the doubled observing network exceed that based on the single network? Answer: ~66%
• => The Winter T-PARC targeted observing strategy amounts to more than doubling the effectiveness of the observing network for selected high impact winter storm forecast cases
Future plans: Improving high impact weather forecasts by adaptive observing and data processing methods
The proposed research will focus on the following aspects:
• Re-evaluation of past WSR / winter T-PARC adaptive observations with a modern version of the data assimilation/forecast system (e.g. GDAS/GFS) (re-run / mini-reanalysis)
• Making the winter T-PARC 2009 data archive available to the research communities
• Adaptive targeting tools for automatic identification of sensitive regions for global data collection (a precondition for optimal data collection for DA)
• Satellite data processing: enhanced GSI assimilation of satellite radiances vs. the current GSI method. The objective is to demonstrate these methods in an operational system.
• Adaptive methods for assimilation of targeted data and evaluation of the data impact
• Inclusion of UAS, lidar, and Gulf of Mexico flights
• Partially funded by the NOAA THORPEX program, jointly with ESRL/GSD
Observations: Future Directions
• Manned aircraft
  – C-130s from the Gulf of Mexico (WSRP) missions
  – NOAA P-3s out of west Mexico
  – Track changes
• UAS
  – Global Hawk, etc.
• Satellite data adaptive processing
• Satellite wind adaptive observation
• Global wind lidar – adaptive design
  – Direct detection mode vs. coherent detection mode
Inclusion of C-130s from the Gulf of Mexico
An example: the Winter 2010 East Coast snowstorm on Feb 6
Inclusion of Atlantic WSRP: tracks modified
Global Hawk – long duration
Average surface pressure forecast error reduction from WSR 2000
The average surface pressure forecast error reduction for Alaska (55°–70°N, 165°–140°W), the west coast (25°–50°N, 125°–100°W), the east coast (100°–75°W), and the lower 48 states of the United States (125°–75°W). Positive values show forecast improvement, while negative values show forecast degradation.
(From Szunyogh et al. 2002)
From Irvine et al. 2010:
Replacing two sondes with two sondes released further from the orography increased the maximum improvement from 7% to 18%.
New metrics of evaluation
• Energy norm evaluation on 1000 km regions; weighting should be considered