Validation of the FSU/COAPS Climate Model

Mary Beth Engelman [T. LaRow, D.W. Shin, S. Cocke, and M. Griffin]. CDPW Meeting – Tallahassee, FL, October 22, 2007.



Page 1: Validation of the FSU/COAPS Climate Model

Validation of the FSU/COAPS Climate Model

Mary Beth Engelman

[T. LaRow, D.W. Shin, S. Cocke, and M. Griffin]

CDPW Meeting – Tallahassee, FL

October 22, 2007

Page 2: Validation of the FSU/COAPS Climate Model

Outline:

I. Introduction to work

II. Background (Previous Studies)

III. Data

IV. Methodology

V. Results

VI. Summary/Conclusions

VII. Future Work

Page 3: Validation of the FSU/COAPS Climate Model

Introduction: The study

• What I’m Doing…

– Scientific Objective:
• Validating the FSU/COAPS Climate Model

– Variables:

» Surface Temperature (Maximum and Minimum)

» Precipitation

– Comparing:

» Model forecast vs. Observations

» Real-time runs (persisted SSTA) vs. Hindcast runs (observed weekly SSTA)

– Scientific Questions:

1. How “well” does the FSU/COAPS climate model forecast surface temperature and precipitation over the southeast United States?

2. How do forecasts made from persisted SSTA compare to hindcasts made from weekly updated prescribed SSTA?

Page 4: Validation of the FSU/COAPS Climate Model

Introduction: Reasons for study

• And Why…

– Benefit to society:
• Crop models can assist decision makers in minimizing losses and maximizing profits in the agricultural community (Hansen et al., 1997)

– Benefit to the science:
• Evaluation of the need for a coupled ocean-atmosphere climate model
• Validation potentially leads to improved climate models/forecasts

Page 5: Validation of the FSU/COAPS Climate Model

Previous Studies: Persisted vs. prescribed SSTA

– Why use persisted anomalies for climate forecasts?
• Currently, coupled models do not outperform predictions given by persisted SSTA at lead times less than 3 months (exception: ENSO events)

– Why compare real-time runs to hindcast runs?
• Overestimates skill, but evaluates the model's potential
• Determines how much skill is lost from using persisted SSTA

– Performance of real-time vs. hindcast runs?
• Similar values of skill, unless there are large SSTA errors (Goddard and Mason, 2002)

Page 6: Validation of the FSU/COAPS Climate Model

Previous Studies: SST and the southeast climate

• SOI and Southeast United States El Niño Signals (Ropelewski and Halpert, 1986)

– Consistent signals:

• Above normal precipitation (October-March)

• Below normal temperature

– Physical mechanisms causing precipitation and temperature signals (teleconnections vs. direct link)

• Tropical Pacific SSTA and ENSO Signals (Montroy, 1997)

– Strongest signals: Precipitation in the Southeast US
• Positive: November-March
• Negative: July-August

• Teleconnections (Gershunov and Barnett, 1997)

– Strong teleconnection between tropical Pacific SSTs and wintertime precipitation over the coastal southeast US (negatively correlated in the interior states)

Page 7: Validation of the FSU/COAPS Climate Model

Previous Studies: The model

The FSU/COAPS Climate Model:

• Large Scale Features:

– Capable of reproducing large scale features associated with ENSO (Cocke and LaRow, 2000)

– Capable of reproducing teleconnections response (Cocke et al., 2007)

• Temperature and Precipitation:

– Regional and global models are both in “reasonable” agreement with observations (Cocke and LaRow, 2000)

– Alterations to the physics of the model resulted in a strong wet bias and a cold surface temperature bias; upgrading the model to include the NCAR CLM2 land parameterization reduced these biases (Shin et al., 2004)

– Regional model rainfall events:
• Overpredicts small rainfall events
• “Accurately” produces higher-magnitude rainfall events (Cocke et al., 2007)

– “Seasonal variations, anomalies, and changes in extreme daily events for maximum temperature are successfully forecasted” (Lim et al., 2007)

Page 8: Validation of the FSU/COAPS Climate Model

Previous Studies: Summary

• What to expect in my study (Hypothesis):

– Model Forecast vs. Observations:
• Model forecasts will show a wet bias in precipitation and a cold bias in maximum surface temperature (unless bias corrected)
• Forecasts should show “reasonable” skill (anomalies, seasonal variations, various skill scores)
• Forecasts will tend to overpredict the number of small rainfall events

– Real-Time Runs vs. Hindcast Runs (Persisted vs. Prescribed SSTA):
• In general, hindcast runs will outperform real-time runs
• Real-time runs should be comparable to hindcast runs when equatorial Pacific SSTA remain fairly constant throughout the duration of the forecast
• Loss of skill in the real-time runs will occur:
– When equatorial Pacific SSTA take a sharp change
– Where areas are greatly influenced by SSTs (high teleconnection signal)
• Skill of real-time runs will decrease after 3 months

Page 9: Validation of the FSU/COAPS Climate Model

Data: Model

• Details of model (Spectral Model):
– Resolution:
• GSM: T63 (1.875˚ x 1.875˚)
• NRSM: 20 x 20 km
– Bias:
• Precipitation: wet
• Max Temperature: cold

• Real-time runs:
– Persisted SSTA
– Model Runs (5 ensembles): January 2005 and 2006 (3-month forecasts)

• Hindcast runs:
– Prescribed SSTA (updated weekly)
– Model Runs (5 ensembles): October 2004 and 2005 (6-month forecasts)
– Model Climatology (20 ensembles): 1987-2005

[Slide graphic: FSU/COAPS Climate Model – Model Details]

Page 10: Validation of the FSU/COAPS Climate Model

Data: Observations

COOP Observation Network:

• Daily/Monthly Temperature and Precipitation (January 1986-June 2007)

• Station to grid:

– 172 long term stations (1948-present)

– For each grid point…

• Finds the closest stations (using the great circle distance)

• Assigns the station’s temperature and precipitation to that grid
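As a rough illustration of this station-to-grid step, here is a minimal Python sketch. The station list, coordinates, and values are invented, and the haversine form of the great-circle distance is assumed, since the slides do not give the exact formula.

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula), in km."""
    r = 6371.0  # mean Earth radius (km)
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical COOP-style stations: (name, lat, lon, tmax_degC, precip_mm)
stations = [
    ("A", 30.4, -84.3, 21.0, 3.2),
    ("B", 29.7, -82.3, 23.5, 0.0),
    ("C", 32.3, -86.3, 19.8, 7.6),
]

grid_lat, grid_lon = 30.0, -83.5  # one grid point of the verification grid

# Assign the closest station's temperature and precipitation to the grid point
nearest = min(stations, key=lambda s: great_circle_km(grid_lat, grid_lon, s[1], s[2]))
print(nearest[0], nearest[3], nearest[4])  # station id, tmax, precip assigned to this grid point
```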

Page 11: Validation of the FSU/COAPS Climate Model

Methodology

• In General:

– Looking for the ability to forecast:
• Correct anomaly
• Extreme events: Heavy Precipitation/Drought, Freezing Temperatures

• Today's Talk (Temperature): ability to forecast:

1. Average monthly temperature anomaly
2. Max temperature distributions (bias corrected)*
3. Freeze events (bias corrected)*

*bias corrected: T = T_f - (T_m - T_o)

where:
T_f = forecasted temperature
T_m = model climatology
T_o = observed climatology
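A minimal Python sketch of this bias correction (the numbers are invented for illustration; the variable names follow the slide's T_f, T_m, T_o):

```python
import numpy as np

# Invented forecast temperatures (degC) at one grid point
t_f = np.array([14.0, 16.5, 12.2, 18.1])  # T_f: forecasted temperature
t_m = 15.0                                # T_m: model climatology at this grid point
t_o = 17.3                                # T_o: observed climatology at this grid point

# Bias-corrected forecast: remove the model's climatological departure from observations
t_corrected = t_f - (t_m - t_o)
print(t_corrected)  # each value shifted warmer by 2.3 degC in this example
```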

Page 12: Validation of the FSU/COAPS Climate Model

Comparing JFM 05 vs JFM 06 SSTA

[Figure: SSTA panels for 2005 and 2006]

• 2005: SSTA switch from warm to cool
• 2006: SSTA remain fairly constant

(Figures provided by: http://www.pmel.noaa.gov/tao/jsdisplay/)

Page 13: Validation of the FSU/COAPS Climate Model

Results: Ability to forecast anomalies

Error in Monthly Temperature Anomalies (˚C): ε = f′ − o′

[Figure: anomaly-error panels for January, February, and March of 2005 and 2006, real-time vs. hindcast runs; initial times: Jan. 05, Oct. 05, Jan. 06, Oct. 06]

(A minimal sketch of this error calculation appears at the end of this slide.)

• 2006 real-time runs resulted in less error than the 2005 real-time runs

• 2005 and 2006 hindcast runs showed approximately the same magnitude of error

• In 2006 the real-time runs and hindcast runs resulted in “similar” forecast error

(Panel |ε| values: 0.77, 0.87, 0.95, 1.40, 1.96, 2.50, 2.65, 2.72, 2.79, 3.14, 4.28, 4.86)
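A minimal Python sketch of the error calculation above, assuming gridded monthly means held as NumPy arrays (all values invented; treating the per-panel |ε| as a grid-mean absolute error is an assumption):

```python
import numpy as np

# Invented 2x2 grids of monthly-mean temperature (degC)
forecast = np.array([[11.0, 12.5], [14.0, 13.2]])
f_clim   = np.array([[10.0, 12.0], [13.0, 12.0]])  # model climatology
observed = np.array([[12.0, 13.0], [13.5, 14.0]])
o_clim   = np.array([[11.5, 12.5], [13.0, 13.5]])  # observed climatology

f_anom = forecast - f_clim   # f': forecast anomaly
o_anom = observed - o_clim   # o': observed anomaly

eps = f_anom - o_anom        # error in the monthly temperature anomaly
print(np.mean(np.abs(eps)))  # one summary |eps| per panel
```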

Page 14: Validation of the FSU/COAPS Climate Model

Results: Ability to forecast distributions

Daily Maximum Temperatures (January-March)

• Observed distributions were more accurately captured by the models in 2006

• Hindcast vs. real-time runs differ more in 2005 than in 2006

• In both years, the model overestimates the tails of the distribution
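One way to make this kind of distribution comparison concrete is to look at empirical quantiles of daily maximum temperature; the sketch below uses invented data, and the actual diagnostic behind the slide's figure is not stated:

```python
import numpy as np

rng = np.random.default_rng(0)
# Invented Jan-Mar daily maximum temperatures (degC), 90 days each
obs_tmax  = rng.normal(18.0, 5.0, 90)
fcst_tmax = rng.normal(17.0, 6.5, 90)  # wider spread, i.e. heavier tails than observed

# Compare the distributions through low/high empirical quantiles
for q in (0.05, 0.25, 0.50, 0.75, 0.95):
    print(f"q={q:.2f}  obs={np.quantile(obs_tmax, q):6.1f}  fcst={np.quantile(fcst_tmax, q):6.1f}")
```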

Page 15: Validation of the FSU/COAPS Climate Model

Results: Ability to forecast extreme events

Deep Freeze Events (January – March)

• Model tends to overpredict the number of freeze events

• Real-time and hindcast runs are more similar in 2006

[Figure: counts of deep freeze events (Tmin < -2˚C and Tmin < -4˚C), observed vs. real-time vs. hindcast runs, 2005 and 2006]
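A minimal Python sketch of the freeze-event count (data invented; the slide does not say whether consecutive cold days are grouped into one event, so each qualifying day is counted separately here):

```python
import numpy as np

rng = np.random.default_rng(1)
# Invented Jan-Mar daily minimum temperatures (degC)
tmin = rng.normal(6.0, 5.0, 90)

# Count days below each deep-freeze threshold used on the slide
for threshold in (-2.0, -4.0):
    n_days = int(np.sum(tmin < threshold))
    print(f"Tmin < {threshold:.0f} degC: {n_days} days")
```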

Page 16: Validation of the FSU/COAPS Climate Model

Summary/Conclusions:

– Model forecasts show a cold bias in surface temperature

– The model tends to overpredict “extreme” events

– Real-time runs resembled hindcast runs when equatorial Pacific SSTA remained fairly constant throughout the duration of the forecast (as in 2006)

– When SSTA changed during the time period of the forecast (as in 2005), hindcast runs outperformed real-time runs even at lead times of 3 months

Page 17: Validation of the FSU/COAPS Climate Model

Future Work:

– Evaluate Real-Time October runs:
• Currently comparing January real-time runs (0-month lead time) to October hindcast runs (3-month lead time)

– Examine other verification methods/skill scores (a small Heidke Skill Score sketch follows below):
• Anomaly correlations
• Taylor diagrams
• Relative Operating Characteristics (ROC)
• Heidke Skill Score (HSS)

– Further investigate SSTA in the tropical Pacific and their effects on model forecast errors
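As a sketch of one of the planned skill scores (the counts below are invented), the Heidke Skill Score for a 2x2 yes/no contingency table compares the proportion of correct forecasts with the agreement expected by chance:

```python
def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
    """Heidke Skill Score for a 2x2 contingency table (forecast yes/no vs. observed yes/no)."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pc = (a + d) / n                                            # proportion correct
    chance = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2   # agreement expected by chance
    return (pc - chance) / (1.0 - chance)

# Invented counts for, e.g., "above-normal temperature" forecasts
print(heidke_skill_score(hits=12, false_alarms=5, misses=4, correct_negatives=19))  # ~0.54
```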

Page 18: Validation of the FSU/COAPS Climate Model

Questions?

Page 19: Validation of the FSU/COAPS Climate Model

References:

Cocke, S., and T. E. LaRow, 2000: Seasonal predictions using a regional spectral model embedded within a coupled ocean-atmosphere model. Mon. Wea. Rev., 128, 689–708.

Goddard, L., S. J. Mason, S. E. Zebiak, C. F. Ropelewski, R. Basher, and M. A. Cane, 2001: Current approaches to seasonal-to-interannual climate predictions. International Journal of Climatology, 21, 1111–1152. DOI: 10.1002/joc.636

Goddard, L., and S. J. Mason, 2002: Sensitivity of seasonal climate forecasts to persisted SST anomalies. Climate Dynamics, 19, 619–631.

Gershunov, A., 1998: ENSO influence on intraseasonal extreme rainfall and temperature frequencies in the contiguous United States: Implications for long-range predictability. J. Climate, 11, 3192–3203.

Jolliffe, I. T., and D. B. Stephenson, 2003: Forecast Verification. Wiley, 240 pp.

Lim, Y. K., D. W. Shin, S. Cocke, T. E. LaRow, J. T. Schoof, J. J. O'Brien, and E. Chassignet, 2007: Dynamically and statistically downscaled seasonal forecasts of maximum surface air temperature over the southeast United States.

Montroy, D., 1997: Linear relation of central and eastern North American precipitation to tropical Pacific sea surface temperature anomalies. J. Climate, 10, 541–558.

Murphy, A. H., 1993: What is a good forecast? An essay on the nature of goodness in weather forecasting. Weather and Forecasting, 8, 281–293.

Ropelewski, C. F., and M. S. Halpert, 1986: North American precipitation and temperature patterns associated with the El Niño/Southern Oscillation (ENSO). Mon. Wea. Rev., 114, 2352–2362.

Shin, D. W., S. Cocke, T. E. LaRow, and J. J. O'Brien, 2005: Seasonal surface air temperature and precipitation in the FSU climate model coupled to the CLM2. J. Clim., 18, 3217–3228.

Wilks, D. S., 1995: Statistical Methods in the Atmospheric Sciences: An Introduction. San Diego: Academic Press.

Wilks, D. S., and R. L. Wilby, 1999: The weather generation game: A review of stochastic weather models. Progress in Physical Geography, 23, 329–357.