

Page 1: Overview of the WP5.3 Activities

ENSEMBLES RT4/RT5 Joint Meeting Paris, 10-11 February 2005

Overview of the WP5.3 Activities

Partners: ECMWF, METO/HC, MeteoSchweiz, KNMI, IfM, CNRM, UREAD/CGAM, CNRS/IPSL, BMRC, CERFACS

Page 2: Overview of the WP5.3 Activities


Forecast quality assessment

Forecast quality assessment is a basic component of the prediction process.

Information about the quality and the uncertainty of the predictions is as important as the prediction itself.

Page 3: Overview of the WP5.3 Activities


WP5.3 activities
WP5.3: Assessment of s2d forecast quality

• Target: “assessment of the actual and potential skill of the models and the different versions of the multi-model ensemble system”

• Main tasks during the first 18 months:
- Assessment of the actual and potential skill of the different ensemble prediction systems and sensitivity experiments, including a comparison with reference models (link WP4.4).
- Estimate useful skill for end users in seasonal-to-decadal hindcasts to assess their potential economic value (link WP5.5).
- Develop web-based verification technology (link WP2A.4).
- Assessment of the skill in predicting rare events (links WP4.3 and WP5.4).

• Other links: RT1, RT2A

Page 4: Overview of the WP5.3 Activities


WP5.3 activities
WP5.3: Assessment of s2d forecast quality

• First 18-month deliverables:
- 5.3 (UREAD/CGAM): Optimal statistical methods for combining multi-model simulations to make probabilistic forecasts of rare extreme events
- 5.4 (UREAD/CGAM): Best methods for verifying probability forecasts of rare events
- 5.7 (ECMWF): Skill of seasonal NAO and PNA using multi-model seasonal integrations from DEMETER

• First 18-month milestone:
- M5.2 (KNMI): Prototype of an automatic system for forecast quality assessment of seasonal-to-decadal hindcasts

• First 18-month activity: ECMWF (3), MeteoSchweiz (1), UREAD/CGAM (0), CNRS/IPSL (6), KNMI (0), METO/HC (0)

Page 5: Overview of the WP5.3 Activities


WP5.3 action plan
WP5.3: Assessment of s2d forecast quality

• Two different types of verification activities:
- Automatic quality control
- Research on verification

• Research verification requires efficient data dissemination:
- MARS and the public data server at ECMWF
- Climate Explorer

• A probabilistic model is needed before probabilistic verification can be carried out (a minimal sketch follows this slide)

• Broad range of research studies, in close link with the validation work in RT4 and RT5

• Verification based on the end-to-end approach
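The remark on probabilistic models deserves a concrete illustration. Here is a minimal sketch (Python, not part of the original slides) of the simplest such model: category probabilities obtained by counting the ensemble members that fall into each climatological tercile. All names and numbers below are hypothetical.

import numpy as np

def tercile_probabilities(ensemble, climatology):
    """Simplest probabilistic model for an ensemble forecast:
    the probability of each tercile category is the fraction of
    ensemble members falling into it.

    ensemble    : 1-D array of M member values for one forecast
    climatology : 1-D array of historical values defining the terciles
    returns     : (p_below, p_normal, p_above)
    """
    # Category boundaries from the climatological record
    lower, upper = np.percentile(climatology, [100 / 3, 200 / 3])
    m = len(ensemble)
    p_below = np.sum(ensemble < lower) / m
    p_above = np.sum(ensemble > upper) / m
    return p_below, 1.0 - p_below - p_above, p_above

# Hypothetical example: a 9-member ensemble against a 30-year climatology
rng = np.random.default_rng(0)
clim = rng.normal(size=30)
ens = rng.normal(loc=0.5, size=9)
print(tercile_probabilities(ens, clim))

More sophisticated probabilistic models would fit a distribution to the members instead of raw counting; the point is only that some such model must precede probabilistic verification.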

Page 6: Overview of the WP5.3 Activities


Three-tier verification

• Forecast quality needs to be assessed thoroughly also for end-user predictions, but there is no direct relationship between forecast quality and usefulness.

• Use an end-to-end approach: end users develop prediction models that take the prediction limitations into account.

• Forecast reliability becomes a major issue.

• A three-tier scheme can then be considered:
- Tier 1: single meteorological variables are assessed against a reference prediction (climatology, persistence, …)
- Tier 2: application-model hindcasts driven by weather/climate predictions are assessed against an application-model reference (e.g., driven by ERA-40); no reference to the real-world application
- Tier 3: as in Tier 2, but the application-model hindcasts are assessed against observed data

Page 7: Overview of the WP5.3 Activities


Automatic quality control

• Most of the s2d simulations run at ECMWF and have a common output format.

• The quality of the hindcasts produced (units, missing files, wrong data, …) needs to be checked as soon as possible (a sketch of such a check is given below).

• A verification suite runs periodically, with graphical output made available on the web.
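What such a check might look like, as a minimal sketch: the slides give no implementation details, so the file layout, variable names, and valid ranges below are assumptions, not the actual ECMWF suite.

import pathlib
import xarray as xr

EXPECTED = {  # hypothetical layout: one file per start date
    "hindcast_1990-05.nc": {"t2m": ("K", 180.0, 330.0)},
    "hindcast_1991-05.nc": {"t2m": ("K", 180.0, 330.0)},
}

def check_hindcasts(root):
    problems = []
    for fname, variables in EXPECTED.items():
        path = pathlib.Path(root) / fname
        if not path.exists():                         # missing files
            problems.append(f"{fname}: missing")
            continue
        ds = xr.open_dataset(path)
        for var, (units, lo, hi) in variables.items():
            if var not in ds:
                problems.append(f"{fname}: variable {var} absent")
                continue
            if ds[var].attrs.get("units") != units:   # wrong units
                problems.append(f"{fname}: {var} units != {units}")
            data = ds[var]
            if data.isnull().any() or (data < lo).any() or (data > hi).any():
                problems.append(f"{fname}: {var} missing/out-of-range data")
    return problems  # e.g. reported by the periodic verification suite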

Page 8: Overview of the WP5.3 Activities


KNMI Climate Explorer

• An OPeNDAP server allows the Climate Explorer to automatically access the ENSEMBLES data with no local copy of the whole data set (see the access sketch below).

• The Climate Explorer performs correlations, basic probabilistic estimates, EOFs, plotting, etc.

• The capabilities of the Climate Explorer will be expanded to allow for more Tier-1 skill measures, including verification of probability forecasts and rare events (end 2006).
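To illustrate the remote-access idea for readers unfamiliar with OPeNDAP, here is a minimal sketch; the URL, variable, and coordinate names are placeholders, not the actual ENSEMBLES server.

import xarray as xr

# Hypothetical OPeNDAP endpoint; the real ENSEMBLES address is not given here.
URL = "http://example.knmi.nl/opendap/ensembles/t2m_hindcasts.nc"

# Opening the URL reads only metadata; no field data are transferred yet.
ds = xr.open_dataset(URL)

# Only the requested subset travels over the network, so no local
# copy of the whole data set is needed (assuming latitude is stored
# north-to-south, as in many reanalysis products).
nino34 = ds["t2m"].sel(lat=slice(5, -5), lon=slice(190, 240)).mean(["lat", "lon"])
print(nino34.values)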

Page 9: Overview of the WP5.3 Activities


Climate Explorer

T2m point correlation for DEMETER 1-month lead multi-model seasonal hindcasts (1959-2001)

From G. J. van Oldenborgh, KNMI
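A point-correlation map of this kind can be computed along the following lines (a sketch with synthetic data; the actual Climate Explorer code is not shown in the slides):

import numpy as np

def point_correlation(fcst, obs):
    """Correlation between hindcasts and observations at every grid
    point, taken over the hindcast years.

    fcst, obs : arrays of shape (n_years, n_lat, n_lon), e.g. the
                ensemble-mean T2m hindcast and the verifying analysis
    returns   : correlation map of shape (n_lat, n_lon)
    """
    fa = fcst - fcst.mean(axis=0)          # anomalies w.r.t. the period mean
    oa = obs - obs.mean(axis=0)
    cov = (fa * oa).mean(axis=0)
    return cov / (fa.std(axis=0) * oa.std(axis=0))

# Synthetic example: 43 years (1959-2001) on a tiny 3 x 4 grid
rng = np.random.default_rng(1)
truth = rng.normal(size=(43, 3, 4))
fcst = truth + rng.normal(scale=1.5, size=(43, 3, 4))   # noisy "hindcast"
print(point_correlation(fcst, truth).round(2))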

Page 10: Overview of the WP5.3 Activities


Tier-1 verification

Example: MeteoSwiss will work on the debiased ranked probability skill score RPSSd

• Conventional probabilistic skill scores based on the Brier score have a negative bias due to the finite ensemble size.

• How can forecasts from systems with small, or even different, ensemble sizes be compared? (A sketch of the debiasing idea follows below.)

From M. Liniger, MeteoSwiss

[Figure, two panels: RPSS of unskilled (with respect to climatology) forecasts as a function of ensemble size, for L = 1 to 4 forecast categories]

Müller, Appenzeller, Doblas-Reyes and Liniger, J. Clim., in press
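The paper cited above estimates the climatological reference score by random resampling; the sketch below follows that general idea, but all details (categories, sample sizes, data) are illustrative assumptions rather than the exact MeteoSwiss formulation. Scoring the climatological reference with random M-member ensembles gives the reference the same finite-size penalty as the forecast, which removes the bias.

import numpy as np

rng = np.random.default_rng(2)

def rps(prob, obs_cat, K):
    """Ranked probability score for one forecast.
    prob : length-K category probabilities; obs_cat : observed category index."""
    obs = np.zeros(K)
    obs[obs_cat] = 1.0
    return np.sum((np.cumsum(prob) - np.cumsum(obs)) ** 2)

def ens_probs(members, edges):
    """Category probabilities from member counts (K = len(edges) + 1)."""
    cats = np.searchsorted(edges, members)
    return np.bincount(cats, minlength=len(edges) + 1) / len(members)

def debiased_rpss(fcst, obs, clim, K=3, n_mc=200):
    """RPSSd: skill against a climatological reference scored with
    random M-member ensembles, removing the negative bias of the
    standard RPSS for small ensembles."""
    M = fcst.shape[1]
    edges = np.quantile(clim, np.arange(1, K) / K)   # category boundaries
    obs_cat = np.searchsorted(edges, obs)
    rps_fc = np.mean([rps(ens_probs(f, edges), c, K)
                      for f, c in zip(fcst, obs_cat)])
    # Reference: random M-member "climatology ensembles", resampled n_mc times
    rps_cl = np.mean([rps(ens_probs(rng.choice(clim, size=M), edges), c, K)
                      for c in obs_cat for _ in range(n_mc)])
    return 1.0 - rps_fc / rps_cl

# Unskilled 9-member forecasts: RPSSd should be near 0, whereas the
# standard RPSS (infinite-ensemble reference) would be negative.
years, M = 40, 9
clim = rng.normal(size=500)
obs = rng.normal(size=years)
fcst = rng.normal(size=(years, M))
print(debiased_rpss(fcst, obs, clim))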

Page 11: Overview of the WP5.3 Activities


Tier-1 verification

Example: CNRS/IPSL will develop a tool based on the “local mode analysis” to test the skill of the ISO (intraseasonal oscillation) in seasonal predictions (beg. 2006)

From J.-Ph. Duvel, CNRS/IPSL

[Figure, two panels: curves from -1.0 to 1.0 of the filtered signal for ERA40, CNRM, CRFC, SCWC, LODY, SCNR, SMPI, UKMO, UKMU and their mean; March-September panel for starts on 1st May (monsoon breaks), September-March panel for starts on 1st November (MJO)]

Inter-annual correlation between simulated and observed OLR intraseasonal variance (90-day time section, one correlation every 5 days, 22 years) over the tropical Indian Ocean.
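The slides do not spell out the computation behind this diagnostic. One plausible reading of the caption (an assumption for illustration only, not the actual local mode analysis tool) is: band-pass filter daily OLR to intraseasonal periods, take its variance over a 90-day section in each year, and correlate simulated against observed variances across the 22 years.

import numpy as np

def intraseasonal_variance(olr, lo=1 / 90.0, hi=1 / 20.0, dt=1.0):
    """Variance of the intraseasonal part of a daily OLR series.
    A simple FFT band-pass keeping periods of roughly 20-90 days
    (a common ISO choice; the slides do not give the band)."""
    freqs = np.fft.rfftfreq(olr.size, d=dt)
    spec = np.fft.rfft(olr - olr.mean())
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    return np.var(np.fft.irfft(spec, n=olr.size))

def interannual_correlation(sim, obs):
    """Correlation across years of simulated vs observed intraseasonal
    variance for one 90-day time section.
    sim, obs : arrays of shape (n_years, 90) of daily OLR."""
    vs = np.array([intraseasonal_variance(x) for x in sim])
    vo = np.array([intraseasonal_variance(x) for x in obs])
    return np.corrcoef(vs, vo)[0, 1]

# Synthetic example: 22 years of 90-day sections
rng = np.random.default_rng(3)
obs = rng.normal(size=(22, 90))
sim = obs + rng.normal(scale=1.0, size=(22, 90))
print(interannual_correlation(sim, obs))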

Page 12: Overview of the WP5.3 Activities


Tier-3 verification

[Figure: box-and-whisker plots of predicted wheat yield, one panel per country: France, Germany, Denmark, Greece]

Simulation              Weighted yield error (%) ± standard error
JRC February            7.1 ± 0.9
JRC April               7.7 ± 0.5
JRC June                7.0 ± 0.6
JRC August              5.4 ± 0.5
DEMETER (Feb. start)    6.0 ± 0.4

DEMETER multi-model predictions (7 models, 63 members, February starts) of average wheat yield for four European countries (box-and-whiskers) compared to Eurostat official yields (black horizontal lines) and crop results from a simulation forced with downscaled ERA-40 data (red dots). A sketch of the error measure is given below.

From P. Cantelaube and J.-M. Terres, JRC
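The slides do not define the "weighted yield error". Purely as an illustration, one plausible reading is a weighted (e.g. by cultivated area) mean absolute relative error of the ensemble-mean yield against the official yield, with the standard error taken over the hindcast years; everything below is an assumption, not the JRC method.

import numpy as np

def weighted_yield_error(pred, obs, weights):
    """Hypothetical reading of 'weighted yield error (%)': per year,
    the weighted mean over countries of the absolute relative error
    of the ensemble-mean yield, returned as mean +/- standard error
    over the hindcast years.

    pred    : (n_years, n_countries, n_members) predicted yields
    obs     : (n_years, n_countries) official yields
    weights : (n_countries,) country weights, summing to 1
    """
    rel_err = np.abs(pred.mean(axis=2) - obs) / obs * 100.0  # % per year/country
    yearly = rel_err @ weights                               # weight over countries
    return yearly.mean(), yearly.std(ddof=1) / np.sqrt(len(yearly))

# Synthetic example: 10 years, 4 countries, 63 members (made-up numbers)
rng = np.random.default_rng(4)
obs = rng.uniform(4.0, 8.0, size=(10, 4))                    # t/ha
pred = obs[..., None] + rng.normal(scale=0.4, size=(10, 4, 63))
w = np.array([0.4, 0.3, 0.2, 0.1])
print(weighted_yield_error(pred, obs, w))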

Page 13: Overview of the WP5.3 Activities


Questions?

Page 14: Overview of the WP5.3 Activities


Data dissemination

• Different depending on the access granted to ECMWF systems:
- With access: MARS (http://www.ecmwf.int/services/archive/)
- Without access: the public data server and an OPeNDAP (DODS) server

• A service offering immediate and free access to data from:
- DEMETER
- ERA-40
- ERA-15
- ENACT
with monthly and daily data, area-selection and plotting facilities, and GRIB or NetCDF formats.