Г.С.Ривин

Tomsk, 16-25.07.2004. Modern Computational Technologies in Weather Forecasting Problems. Г.С.Ривин. [email protected]. www.ict.nsc.ru.


  • ATMOSPHERIC PROCESSES in SPACE-ATMOSPHERE-SEA/LAND system

  • Submodels

  • WMO WEATHER FORECASTING RANGES

  • SYNOP, AEROLOGICAL DATA, AIRCRAFTS

  • Radar

    RADARS, r = 100 km

  • The system of equations (conservation laws applied to individual parcels of air) (from E. Kalnay)

    V. Bjerknes (1904) pointed out for the first time that there is a complete set of 7 equations with 7 unknowns that governs the evolution of the atmosphere: conservation of the three-dimensional momentum (equations of motion), conservation of dry-air mass (continuity equation), the equation of state for perfect gases, conservation of energy (first law of thermodynamics), and equations for the conservation of moisture in all its phases.

    These equations include in their solution fast gravity and sound waves, and therefore their space and time discretization requires either small time steps or alternative techniques that slow these waves down. For models with a horizontal grid size larger than 10 km, it is customary to replace the vertical component of the equation of motion with its hydrostatic approximation, in which the vertical acceleration is neglected compared with gravitational acceleration (buoyancy). With this approximation, it is convenient to use atmospheric pressure, instead of height, as the vertical coordinate.
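    As a reminder of what the hydrostatic approximation does (standard textbook form, not taken from the slides), the vertical momentum equation is replaced by a diagnostic balance, which in pressure coordinates becomes, via the ideal-gas law,

    \[
    \frac{dw}{dt} = -\frac{1}{\rho}\frac{\partial p}{\partial z} - g
    \;\xrightarrow{\;dw/dt\,\approx\,0\;}\;
    \frac{\partial p}{\partial z} = -\rho g
    \qquad\Longleftrightarrow\qquad
    \frac{\partial \Phi}{\partial p} = -\frac{1}{\rho} = -\frac{RT}{p},
    \]

    where \(\Phi = gz\) is the geopotential. This is why pressure (or a pressure-based hybrid coordinate) is a convenient vertical coordinate for hydrostatic models.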

  • Operational model resolutions (December 2003):

    ECMWF: T511L60, 40 km; EPS: T255L60, 80 km
    DWD: GME (L41), 40 km; LM (L35/50), (2.8)/7 km
    France: ARPEGE (L41), 23-133 km; ALADIN (L41), 9 km
    HIRLAM: (L16-31), 5-55 km
    UK: UM (L30), 60 km; (L38), 12 km
    USA: AVP (T254L64), 60 km; ETA (L60), 12 km
    Japan: GSM (L40), 60 km; MSM (L40), 10 km
    Rus. Fed.: T85L31, 150 km; (L31), 75 km; Moscow region (300 km x 300 km), 10 km

  • Coordinate systems: p, sigma, z, eta, hybrid

    Models of the atmosphere: steps: global 40-60 km, local 7-12 km; methods: splitting, semi-Lagrangian scheme (23), ensembles, nonhydrostatic, grids

    Data assimilation: 3(4)D-Var, Kalman filter

    Reanalyses: NCEP/NCAR (USA), 50 years (1948-; T62L28, ~210 km); Reanalysis-2 (ETA RR, 32 km, 45 layers); ECMWF ERA-15 (TL106L31, ~150 km, 1979-1993); ERA-40 (TL159L60, ~120 km, 3D-Var, mid-1957 to 2001)

    FEATURES OF INFORMATION AND COMPUTATIONAL TECHNOLOGIES IN ATMOSPHERIC SCIENCES

  • Modern and possible further development of computational technologies: ensemble simulation

    One method (used by the ECMWF forecast system) is based on finding initial perturbations with the help of part of the eigenvectors (singular vectors) of the linear operator L, obtained by linearizing the operator N of the finite-difference scheme of the forecasting thermo-hydrodynamic equations

    \[ \frac{\partial \varphi}{\partial t} = N(\varphi), \]

    where \(\varphi\) is the grid vector-function; the other notations in this formula are the usual ones.

    The advantage of this method is its clear physical meaning; its drawbacks are, first, the need to compute the eigenvectors of the linearization L and, second, the need to run a sufficiently large number of additional forecasts.

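    A brief sketch of the singular-vector idea behind this method (standard formulation; the symbols M, E and the optimization interval are generic, not taken from the slides): a small perturbation evolves according to the tangent-linear propagator \(\mathbf{M}\) obtained from the linearization of N, and the fastest-growing initial perturbations over an interval \((0,t)\) maximize the growth ratio in a chosen norm \(\mathbf{E}\):

    \[
    \delta\varphi(t) = \mathbf{M}(t,0)\,\delta\varphi(0),
    \qquad
    \max_{\delta\varphi(0)}
    \frac{\langle \mathbf{M}\delta\varphi(0),\,\mathbf{E}\,\mathbf{M}\delta\varphi(0)\rangle}
         {\langle \delta\varphi(0),\,\mathbf{E}\,\delta\varphi(0)\rangle}
    \;\Longrightarrow\;
    \mathbf{M}^{*}\mathbf{E}\,\mathbf{M}\,v_k = \sigma_k^2\,\mathbf{E}\,v_k .
    \]

    The leading eigenvectors \(v_k\) (the singular vectors of \(\mathbf{M}\)) are then used to build the perturbed initial states of the ensemble members, which is why both an eigenvector computation and many additional forecasts are required.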

  • ECMWF: FORECASTING SYSTEM - DECEMBER 2003

    Model:
    Smallest half-wavelength resolved: 40 km (triangular spectral truncation 511)
    Vertical grid: 60 hybrid levels (top pressure: 10 Pa)
    Time step: 15 minutes
    Numerical scheme: semi-Lagrangian, semi-implicit time-stepping formulation
    Number of grid points in model: 20,911,680 upper-air, 1,394,112 in land-surface and sub-surface layers. The grid for computation of physical processes is a reduced, linear Gaussian grid, on which single-level parameters are available. The grid spacing is close to 40 km.
    Variables at each grid point (recalculated at each time step): wind, temperature, humidity, cloud fraction and water/ice content, ozone content (also pressure at surface grid points)
    Physics: orography (terrain height and sub-grid-scale), drainage, precipitation, temperature, ground humidity, snowfall, snow cover and snow melt, radiation (incoming short-wave and outgoing long-wave), friction (at the surface and in the free atmosphere), sub-grid-scale orographic drag (gravity waves and blocking effects), evaporation, sensible and latent heat flux, oceanic waves.
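    A quick consistency check of the quoted grid-point count (my arithmetic, not from the slides): dividing the upper-air points by the 60 levels gives the number of points per model level, and spreading those over the Earth's surface reproduces the ~40 km spacing:

    \[
    \frac{20{,}911{,}680}{60} = 348{,}528 \ \text{points per level},
    \qquad
    \sqrt{\frac{4\pi R_E^2}{348{,}528}} \approx \sqrt{\frac{5.1\times 10^{8}\ \text{km}^2}{348{,}528}} \approx 38\ \text{km}.
    \]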

  • ECMWF: FORECASTING SYSTEM - DECEMBER 2003

    Data Assimilation:
    Analysis: mass and wind (four-dimensional variational multivariate analysis on 60 model levels); humidity (four-dimensional variational analysis on model levels up to 250 hPa); surface parameters (sea surface temperature from the NCEP Washington analysis, sea ice from SSM/I satellite data), soil water content, snow depth, and screen-level temperature and humidity.
    Data used: global satellite data (SATOB/AMV, (A)TOVS, QuikSCAT, SSM/I, SBUV, GOME, Meteosat-7 WV radiances), global free-atmosphere data (AIREP, AMDAR, TEMP, PILOT, TEMP/DROP, PROFILERS), oceanic data (SYNOP/SHIP, PILOT/SHIP, TEMP/SHIP, DRIBU), land data (SYNOP). Data checking and validation are applied to each parameter used. Thinning procedures are applied when observations are redundant at the model scale.
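    For reference, the generic 4D-Var cost function minimized in such an analysis (textbook form; B, R, H and M denote the usual background-error covariance, observation-error covariance, observation operator and forecast model, not ECMWF-specific symbols) is

    \[
    J(x_0) = \tfrac{1}{2}\,(x_0 - x_b)^{\mathrm T} \mathbf{B}^{-1} (x_0 - x_b)
           + \tfrac{1}{2} \sum_{i} \bigl(H_i(x_i) - y_i\bigr)^{\mathrm T} \mathbf{R}_i^{-1} \bigl(H_i(x_i) - y_i\bigr),
    \qquad x_i = M_{0\to i}(x_0),
    \]

    i.e. the analysis is the model trajectory over the assimilation window that best fits both the background \(x_b\) and all observations \(y_i\) distributed in time.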

  • the Penn State/NCAR Mesoscale Model (e.g., Dudhia, 1993),

    the CAPS Advanced Regional Prediction System (Xue et al., 1995),

    NCEP's Regional Spectral Model (Juang et al., 1997),

    the Mesoscale Compressible Community (MCC) model (Laprise et al., 1997),

    the CSU RAMS (Tripoli and Cotton, 1980),

    the US Navy COAMPS (Hodur, 1997).

  • WRF Development Teams (courtesy NCAR): Working Groups WG1-WG16

    Numerics and Software (J. Klemp); Data Assimilation (T. Schlatter); Analysis and Validation (K. Droegemeier); Community Involvement (W. Kuo); Operational Implementation (G. DiMego)

    Working Groups: Dynamic Model Numerics (W. Skamarock); Standard Initialization (J. McGinley); Analysis and Visualization (L. Wicker); Workshops, Distribution, and Support (J. Dudhia); Data Handling and Archive (G. DiMego); Model Physics (J. Brown); 3D-Var (J. Derber); Model Testing and Verification (C. Davis); Operational Requirements (G. DiMego); Software Architecture, Standards, and Implementation (J. Michalakes); Atmospheric Chemistry (P. Hess); Land Surface Models (J. Wegiel); Operational Forecaster Training (T. Spangler); Advanced Techniques (D. Barker); Ensemble Forecasting (D. Stensrud); Regional Climate Modeling (proposed)

    WRF/CHSSI development plan:

    Code development and testing tasks: framework and module interfaces; parallelization; I/O and initialization; computational components; meta-computing/grids; performance and scaling; validation of results.

    Dependencies: external - first cut of physics/dynamics from the WRF groups, test cases; internal - model initialization.

    Resources external to WRF: DOE (ASCI, CCA), NASA (HPCC), NCAR/SCD, Argonne NL, NCSA (HDF, PACI), Unidata, NCSC, other alpha users.
    WRF resources external to CHSSI: NCAR/MMM, NOAA/FSL, AFWA (postdoc).
    Resources internal to CHSSI: NCAR/MMM, Aerospace Corp., AFWA.

    Development timeline (months 0-36):
    Month 0: prototype parallel driver and automatic data registry; idealized initialization; leapfrog and Runge-Kutta dynamics (two-level time-split, low-order advection, height coordinates); simple advection of moisture tracers; ported to DMP, SMP and hybrid parallelism; no input, makeshift output; tested against an idealized case.
    Month 3: computational frameworks; extend registry to WRF IDE; design nesting and the I/O API; real-data initialization; mass coordinates, high-order advection, SLT; determine initial physics set; parallelize new components, additional parallel libraries, benchmarking and optimization; I/O API with scalable history and checkpoint/restart, metadata format.
    Month 6: implement the initial physics set.
    Month 9: comparative evaluation of dynamics (semi-implicit option); testing with initial physics; begin validation.
    Month 18 (alpha): alpha driver layer and WRF IDE; real-data initialization; best dynamics scheme; first set of physics options; fully parallel on the first CHSSI platform; scalable I/O and metadata on the primary system; alpha version ready for testing.
    Month 21: implement WRF using an off-the-shelf computational-framework package; nesting; 3DVAR, 4DVAR; semi-implicit and second dynamics scheme (compact differencing); additional physics and chemistry; continue benchmarking and optimization, extend to other platforms; extend WRF to other I/O and metadata packages; case validation, physics tuning, refinement.
    Month 30 (beta): performance-portable WRF; beta version ready for testing.
    Month 33: final documentation, testing and support; first WRF Users' Workshop.

  • Model Physics in High-Resolution NWP: physics "no man's land" (from Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)

  • Weather Research and Forecasting Model

    Goals: develop an advanced mesoscale forecast and assimilation system, and accelerate research advances into operations.
    Collaborative partnership, principally among NCAR, NOAA, DoD, OU/CAPS, FAA, and the university community.
    Multi-agency WRF governance; development conducted by 15 WRF Working Groups.
    Software framework provides portable, scalable code with plug-compatible modules.
    Ongoing active testing and rapidly growing community use: over 1,400 registered community users, annual workshops and tutorials for the research community.
    Daily experimental real-time forecasting at NCAR, NCEP, NSSL, FSL, AFWA, U. of Illinois.
    Operational implementation at NCEP and AFWA in FY04.
    (From Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)

  • Hurricane Isabel, NOAA-17 AVHRR, 13 Sep 03 14:48 GMT (from Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)

  • Hurricane Isabel track (18/1700Z): 10 km WRF initialized 15/1200Z; 4 km WRF initialized 17/0000Z (from Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)

  • Hurricane Isabel 3 h precipitation forecast: WRF model, 10 km grid, 5-day forecast, initialized 12 UTC 15 Sep 03 (from Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)

  • 48 h Hurricane Isabel reflectivity forecast: 4 km WRF forecast vs. radar composite, initialized 00 UTC 17 Sep 03 (from Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)

  • Hurricane Isabel reflectivity at landfall: radar composite, 18 Sep 2003 1700 Z, and 41 h forecast from the 4 km WRF (from Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)

  • Hurricane Isabel surface-wind forecast: WRF model, 4 km grid, 2-day forecast, initialized 00 UTC 17 Sep 03 (from Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)

  • WRF Mass Coordinate Core

    Terrain-following hydrostatic-pressure vertical coordinate
    Arakawa C-grid, two-way interacting nested grids (soon)
    3rd-order Runge-Kutta split-explicit time differencing
    Conserves mass, momentum, dry entropy, and scalars using flux-form prognostic equations
    5th-order upwind or 6th-order centered differencing for advection
    Physics for CR applications: Lin microphysics, YSU PBL, OSU/MM5 LSM, Dudhia shortwave / RRTM longwave radiation, no cumulus parameterization
    (From Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)
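    As a sketch of what "3rd-order Runge-Kutta split-explicit time differencing" means here (the standard three-stage RK3 used in split-explicit solvers of this type, written without the acoustic sub-stepping), with \(R(\varphi)\) the slow-mode tendency:

    \[
    \varphi^{*} = \varphi^{\,t} + \tfrac{\Delta t}{3}\,R(\varphi^{\,t}), \qquad
    \varphi^{**} = \varphi^{\,t} + \tfrac{\Delta t}{2}\,R(\varphi^{*}), \qquad
    \varphi^{\,t+\Delta t} = \varphi^{\,t} + \Delta t\,R(\varphi^{**}),
    \]

    while the fast acoustic and gravity-wave terms are integrated with smaller sub-steps inside each RK3 stage.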

  • Model Configuration for the 4 km Grid

    Domain: 2000 x 2000 km, 501 x 501 grid
    50 mb top, 35 levels
    24 s time step
    Initialization: interpolated from gridded analyses; BAMEX: 40 km Eta CONUS analysis; Isabel: 1° GFS global analysis (~110 km)
    Computing requirements: 128 processors on an IBM SP Power4 Regatta; run time: 106 min per 24 h of forecast

    (From Joe Klemp, NCAR; Bad Orb, 23-27.10.2003)
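    A small illustrative check of the quoted time step (my arithmetic, assuming a strong advective wind of 100 m/s as an example value): the advective Courant number on the 4 km grid is

    \[
    C = \frac{u\,\Delta t}{\Delta x} = \frac{100\ \text{m/s}\times 24\ \text{s}}{4000\ \text{m}} = 0.6,
    \]

    comfortably within the stability range of the RK3 split-explicit scheme sketched above.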

  • North American Early Guidance System

  • Global Forecast System (GFS)

  • Timeline for WRF at NCEP

    North American WRF: operational in FY05
    Hurricane WRF: operational in FY06
    Rapid Refresh (RUC) WRF (hourly): operational in FY07
    WRF SREF: operational in FY07
    Others? (Fire weather, Homeland Security, etc.) using the best WRF deterministic model

  • The Unified Model

    The Unified Model is the name given to the suite of atmospheric and oceanic numerical modelling software developed and used at the Met Office. The formulation of the model supports global and regional domains and is applicable to a wide range of temporal and spatial scales, which allows it to be used for both numerical weather prediction and climate modelling as well as a variety of related research activities. The Unified Model was introduced into operational service in 1992. Since then, both its formulation and capabilities have been substantially enhanced.

    New Dynamics: a major upgrade to the Met Office global numerical weather prediction model was implemented on 7th August 2002.

    Submodels: The Unified Model is made up of a number of numerical submodels representing different aspects of the earth's environment that influence the weather and climate. Like all coupled models, the Unified Model can be split up in a number of different ways, with various submodel components switched on or off for a specific modelling application.

    The Portable Unified Model (PUM): a portable version of the Unified Model has also been developed, suitable for running on workstations and other computer systems.

    http://www.metoffice.com/research/nwp/numerical/unified_model/index.html

  • The Met Office global numerical weather prediction model was upgraded on 7th August 2002. The package of changes was under trial for over a year and is known as "New Dynamics". Key changes that are part of the New Dynamics package:

    Non-hydrostatic model with height as the vertical coordinate
    Charney-Phillips grid staggering in the vertical, Arakawa C-grid staggering in the horizontal
    Two-time-level, semi-Lagrangian advection and semi-implicit time stepping
    Edwards-Slingo radiation scheme with non-spherical ice spectral files
    Large-scale precipitation including prognostic ice microphysics
    Vertical-gradient-area large-scale cloud scheme
    Convection with convective available potential energy (CAPE) closure, momentum transports and convective anvils
    A boundary-layer scheme which is non-local in unstable regimes
    Gravity-wave drag scheme which includes flow blocking
    GLOBE orography dataset
    The MOSES (Met Office Surface Exchange Scheme) surface hydrology and soil model scheme
    Predictor-corrector technique with no extraction of a basic-state profile
    Three-dimensional Helmholtz-type equation solved using the GCR technique

    http://www.metoffice.com/research/nwp/numerical/unified_model/new_dynamics.html
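    Schematically (a generic textbook form, not the Met Office's specific formulation), semi-implicit treatment of the fast terms leaves at every time step an elliptic problem of Helmholtz type for a pressure-like variable \(\Pi'\),

    \[
    \bigl(\nabla^2 - \lambda\bigr)\,\Pi' = R, \qquad \lambda \sim \frac{1}{\bar c^{\,2}\,\Delta t^{2}} > 0,
    \]

    where \(R\) collects the explicit right-hand-side terms and \(\bar c\) is a reference wave speed; an iterative Krylov-type solver such as GCR (generalized conjugate residual) is a natural choice for this equation on the sphere.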

  • The operational forecast system at Météo-France is based on two different numerical applications of the same code: 1. ARPEGE-IFS; 2. additional code to build the limited-area model ALADIN.

    ARPEGE-IFS has been developed jointly by Météo-France and ECMWF (ARPEGE is the usual name in Toulouse, IFS in Reading): the ECMWF model for medium-range forecasts (4-7 days), and a Toulouse variable-mesh version for short-range predictions (1-4 days). The ALADIN library has been developed jointly by Météo-France and the national meteorological or hydrometeorological services of 14 countries: Austria, Belgium, Bulgaria, Croatia, Czech Republic, Hungary, Moldova, Morocco, Poland, Portugal, Romania, Slovakia, Slovenia, Tunisia.

  • 325 x 325 grid points, 40 (35) layers

  • GME: the hydrostatic global model,

    41 (31) layers and horizontal resolution ~40 (60) km,

    prognostic equations: horizontal wind components, temperature, specific humidity, specific cloud water content and surface pressure,

    physical processes: a comprehensive representation of the precipitation process, a mass-flux convection parameterisation, a radiation model with cloud-radiation interaction, turbulent exchange in the free atmosphere based on a level-2 scheme, surface-layer fluxes based on a bulk approach, a two-layer soil model including energy and mass budget equations for snow cover, and the representation of sub-grid-scale orographic effects,

    the topography of the earth's surface.

  • LM: the nonhydrostatic local model,

    resolution ~2.8 (7) km,

    GME + 3 additional prognostic equations: vertical wind speed, pressure deviation, turbulent kinetic energy (TKE),

    vertical turbulent diffusion (level-2.5 scheme), a laminar sublayer at the earth's surface.

  • Trajectory model

    Forecast variables: data supply from DWD's LM or GME forecast models

    Numerical scheme: Euler-Cauchy with iteration

    Interpolation: 1st order in time, 2nd or 3rd order in space.

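    A sketch of an iterated Euler-Cauchy (Petterssen-type) trajectory step of the kind named above (generic form; the actual DWD implementation may differ in detail): starting from a forward Euler guess, the endpoint is refined with the trapezoidal corrector,

    \[
    \mathbf{x}^{(0)} = \mathbf{x}_0 + \Delta t\,\mathbf{v}(\mathbf{x}_0, t_0),
    \qquad
    \mathbf{x}^{(k+1)} = \mathbf{x}_0 + \frac{\Delta t}{2}\Bigl[\mathbf{v}(\mathbf{x}_0, t_0) + \mathbf{v}\bigl(\mathbf{x}^{(k)}, t_0 + \Delta t\bigr)\Bigr],
    \]

    with the wind \(\mathbf{v}\) interpolated from the model fields (here 1st order in time, 2nd or 3rd order in space) and the iteration stopped after a few passes or upon convergence.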

  • Daily routine (ca. 1500 trajectories)

    1. LM trajectories (7 km, Central and western Europe):

    48h forward trajectories for 36 nuclear and chemical installations.

    2. GME trajectories (60km resolution, worldwide):

    120h forward trajectories for 60 European nuclear sites,

    120h backward trajectories for 37 German radioactivity measuring sites,

    backward trajectories for the international GAW stations,

    backward trajectories for 5 African cities in the METEOSAT MDD programme, disseminated daily via satellite from Bracknell,

    backward trajectories for the German research polar stations Neumayer (Antarctica) and Koldewey (Spitzbergen) and the research ships 'Polarstern' and 'Meteor'.

    Operational emergency trajectory system

    (Trajectory system for scientific investigations)

    1. LM or GME trajectory models

    2. Data supply from LM or GME forecasts or analyses from current database or archives

    3. Forward and backward trajectories for a choice of offered or freely selectable stations, at optional heights and times, within the current time period of 7-12 days

    4. Interactive menu to be executed by forecasters, operational 24 h.

  • Further Development of the Local Systems LME and LMK, 2003 to 2006

    LME: local model LM for the whole of Europe; mesh size 7 km and 40 layers; 78-h forecasts from 00, 12 and 18 UTC.
    LMK: LM-Kürzestfrist (very short range); mesh size < 3 km and 50 layers; 18-h forecasts from 00, 03, 06, 09, 12, 15, 18, 21 UTC for Germany with explicit prediction of deep convection.

    1. Data assimilation

    2Q 2005: use satellite (GPS) and radar data (reflectivity, VAD winds)

    1Q 2006: use European wind profiler and satellite data

  • Further Development of the Local Systems LME and LMK, 2003 to 2006

    2. Local modelling

    2Q 2004: increase the model domain (7 km mesh) from 325x325 up to 753x641 grid points (covering the whole of Europe), 40 layers

    3Q 2005: new convection scheme (Kain-Fritsch?)

  • Europe

  • LMK: LM-Kürzestfrist - a model-based system for nowcasting and very-short-range forecasting

    Goals: prediction of severe weather on the mesoscale; explicit simulation of deep convection.
    Method: 18-h predictions of LM initialised every three hours, mesh size < 3 km.

    Usage of new observations: SYNOP every 60 min, METAR every 30 min, GPS every 30 min, VAD winds every 15 min, reflectivity every 15 min, wind profiler every 10 min, aircraft data.

  • LMK: a new 18-h forecast every three hours (start times 00, 03, 06, 09, 12, 15, 18, 21 UTC; forecast ranges +3 h, +6 h, +9 h, +12 h, +15 h, +18 h)

  • High-Resolution Regional Model (HRM)

    Operational NWP model at 13 services worldwide
    Hydrostatic, (rotated) latitude/longitude grid
    Operators of second-order accuracy
    7 to 28 km mesh size, various domain sizes
    20 to 35 layers (hybrid, sigma/pressure)
    Prognostic variables: ps, u, v, T, qv, qc, qi
    Same physics package as GME
    Programming: Fortran 90, OpenMP/MPI for parallelization
    From 00 and 12 UTC: forecasts up to 78 hours
    Lateral boundary conditions from GME at 3-hourly intervals

  • General structure of a regional NWP system: topographical data, initial data (analysis), lateral boundary data; regional NWP model; direct model output (DMO); graphics/visualization; MOS; Kalman filtering; applications (wave model, trajectories); verification; diagnostics

  • Short Description of the High-Resolution Regional Model (HRM)

    Hydrostatic limited-area meso-α- and meso-β-scale numerical weather prediction model

    Prognostic variables: surface pressure ps, temperature T, water vapour qv, cloud water qc, cloud ice qi, horizontal wind u, v; several surface/soil parameters

    Diagnostic variables: vertical velocity, geopotential, cloud cover clc, diffusion coefficients tkvm/h

  • Current operational users of the HRM

    Brazil, Directorate of Hydrography & Navigation; Brazil, Instituto Nacional de Meteorologia; Bulgaria, National Meteorological & Hydrological Service; China, Guangzhou Regional Meteorological Centre; India, Space Physics Laboratory; Israel, Israel Meteorological Service; Italy, Italian Meteorological Service; Kenya, National Meteorological Service; Oman, National Meteorological Service (DGCAM); Romania, National Meteorological & Hydrological Service; Spain, National Meteorological Institute; United Arab Emirates, National Meteorological Institute; Vietnam, National Meteorological & Hydrological Service and Hanoi University

  • Numerics of the HRM

    Regular or rotated latitude/longitude grid
    Mesh sizes between 0.25° and 0.05° (~28 to 6 km)
    Arakawa C-grid, second-order centered differencing
    Hybrid vertical coordinate, 20 to 35 layers
    Split semi-implicit time stepping; Δt = 150 s at Δ = 0.25°
    Lateral boundary formulation due to Davies
    Radiative upper boundary condition as an option
    Fourth-order horizontal diffusion, slope correction
    Adiabatic implicit nonlinear normal-mode initialization
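    A rough illustration of why the split semi-implicit scheme pays off (my arithmetic, assuming a typical external gravity-wave speed of about 300 m/s, a value not taken from the slides): an explicit scheme would be limited by roughly

    \[
    \Delta t \lesssim \frac{\Delta x}{c} \approx \frac{28\,000\ \text{m}}{300\ \text{m/s}} \approx 90\ \text{s},
    \]

    whereas treating the fast waves implicitly allows the quoted Δt = 150 s at 0.25°, with the advective motions alone setting the limit.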

  • Physical parameterizations of the HRM

    Two-stream radiation scheme (Ritter and Geleyn, 1992) including long- and shortwave fluxes in the atmosphere and at the surface; full cloud-radiation feedback; diagnostic derivation of partial cloud cover (relative humidity and convection)
    Grid-scale precipitation scheme including parameterized cloud microphysics (Doms and Schättler, 1997)
    Mass-flux convection scheme (Tiedtke, 1989) differentiating between deep, shallow and mid-level convection
    Level-2 scheme of vertical diffusion in the atmosphere, similarity theory (Louis, 1979) at the surface
    Two-layer soil model including snow and interception storage; three-layer version for soil moisture as an option

  • Computational aspects of the HRM

    Fortran 90 and C (C only for input/output: GRIB code)
    Multi-tasking for shared-memory computers based on standard OpenMP
    Efficient dynamic memory allocation
    NAMELIST variables for control of the model
    Computational cost: ~3100 Flop per grid point, layer and time step (see the worked estimate after the timing tables below)
    Interface to data of the global model GME available, providing initial and/or lateral boundary data
    Built-in diagnostics of physical processes
    Detailed print-out of meteographs

  • Total wallclock time (for a 24 h forecast) of HRM - Africa (361 x 321 grid points, 31 layers, 28 km) on an IBM RS/6000 SP; area 25° W - 65° E, 40° S - 40° N:

    nproc   time (s)    time (min)
       2    14554.38      242.57
       4     7680.06      128.00
       6     5850.32       97.51
       8     4391.20       73.19
      10     3776.00       62.93
      12     3209.01       53.48
      14     2857.96       47.63
      16     2571.90       42.87

    Time distribution (%) of the main processes of HRM on 16 processors of an IBM RS/6000 SP (instrumentation):

    Start-up of HRM                 2.4
    Lateral boundary update         4.3
    Diabatic processes             39.5
    Explicit forecast              25.2
    Semi-implicit scheme            7.9
    Asselin filtering               4.1
    Condensation/evaporation        3.2
    Diagnostics/meteographs         3.6
    Post-processing of GRIB files   9.8
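    Two quick derived numbers (my arithmetic, based only on the figures above, the ~3100 Flop per grid point, layer and time step, and the Δt = 150 s at 0.25° quoted earlier):

    \[
    361 \times 321 \times 31 \approx 3.59\times 10^{6}\ \text{points},\qquad
    \frac{24\ \text{h}}{150\ \text{s}} = 576\ \text{steps},\qquad
    3.59\times 10^{6} \times 576 \times 3100 \approx 6.4\times 10^{12}\ \text{Flop},
    \]

    so the 16-processor run (2571.9 s) sustains roughly \(6.4\times 10^{12}/2572 \approx 2.5\) GFlop/s in total. The parallel speedup from 2 to 16 processors is \(242.57/42.87 \approx 5.7\) against an ideal factor of 8, i.e. about 71% relative parallel efficiency.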


  • Further Development of the HRM 2003 to 2006

    An MPI version of HRM for Linux PC Clusters, developed by Vietnam, is available to all HRM users since July 2003.

    A 3D-Var data assimilation scheme developed by Italy will be available to experienced HRM users early 2004.

    The physics packages in GME and HRM will remain exactly the same.

    The interaction between the different HRM groups should be intensified.

    A first HRM Users Meeting will take place in Rio de Janeiro (Brazil) in October 2004.

  • Project partners: Univ. Lancaster, Univ. Bristol, ECMWF, WL|Delft, RIZA, SMHI, JRC Ispra, Univ. Bologna, DWD, DMI, GRDC

  • 1) Run the complete assimilation-forecast system for GME and LM for the three historical flood events, for a period of roughly 2 weeks per flood event. 2) Perform high-resolution analyses of 24 h precipitation amounts for the three flood events on the basis of surface observations.

    3) Develop a prototype-scheme for near real-time 24h precipitation analysis on the basis of Radar-data and synoptic precipitation observations.

    In addition to these tasks the operational model results according to task 1) for the period of the Central European flood were retrieved from the archives and supplied to the project ftp-server.

  • Deutscher Wetterdienst (DWD) meteorological data set for the development of a flood forecasting system

    DWD prepared data sets which include all meteorological fields necessary as input fields to hydrological models. Four flood cases in different European river basins for different seasons (autumn, winter and summer) were investigated:

    a) Po 1994, November, Autumn,

    b) Rhine, Meuse 1995, January, Winter,

    c) Odra 1997, July, Summer,

    d) Elbe 2002, August, Summer.

    The fields are based on the analysis of observed precipitation and on model forecasts:

    48 h forecasts by DWD's limited area model LM (ca. 7 km resolution, model area is Central Europe, data provided at hourly intervals);

    156 h forecasts by DWD's global model GME (model resolution ca. 60 km, data provided at 6-hourly intervals on a 0.75° x 0.75° grid with NW corner at 75° N, 35° W and SE corner at 30° N, 45° E);

    analyses of 24 h precipitation observations for the LM area at ca. 7 km resolution.


  • Maps of the constant fields for GME and LM.

  • Precipitation distribution (kg/m²) for 05 Nov 1994, 06 UTC to 06 Nov 1994, 06 UTC:

    a) analysis based on synoptic stations (631),

    b) analysis based on synoptic (631) and MAP (5173) stations,

    c) LM model prediction (18 to 42 hours forecast).

  • Austria 263; Czech Republic 800; Germany 4238; Poland 1356; Switzerland 435; altogether 7092

  • Computer power and model plans 2002-2006 (peak performance and model resolution):

    ECMWF: 0.96 Tf, TL511 (40 km) L60; 10 Tf; 20 Tf, TL511 (40 km) L60; TL799 (25 km) L91
    DWD: 1.92 Tf, 60 km L31, 7 km L35; 2.88 Tf, 40 km L40, 7/2 km L35; 18-28 Tf, 30 km L45
    NCEP: 7.3 Tf, T170 (80 km) L42, 12 km L60; T254 (50 km) L64; 15.6 Tf, TL611 (40 km) L42, 8 km; 28 Tf; 2007: global 30 km, local 5 km
    JMA (Japan): 0.768 Tf, T106 (120 km) L40, 20/10 km L40; TL319 (60 km) L42; 6 Tf, 5 km L50; 20 Tf; 2007: TL959 (20 km) L60
    CMA (China): 0.384 Tf, T213 (60 km) L31, 25 km L20; 3.84 Tf ?; 15 km; 2008: 5 km
    HMC (Russia): 35 Gf, T85 (150 km) L31, 75 km L30; ? Tf; T169 (80 km) L31

  • ECMWF: equipment in use (end of 2003) and computer equipment being readied for operational use

    Machine                     Processors   Memory    Storage   Tape drives
    Fujitsu VPP5000                 100      400 GB    5.3 TB
    3 x HP K580 machines             18      2.2 GB    0.4 GB
    5 x IBM Nighthawk nodes          18       22 GB      4 TB        44

    Machine                     Processors   Memory    Storage   Tape drives
    2 x IBM Cluster 1600           1820     2500 GB     12 TB
    5 x IBM p660 nodes               26       40 GB     20 TB        73
    3 x HP K580 machines             18       22 GB    0.4 TB

  • Central Computer System (CCS). But what are we going to do if we do not have a CCS?

  • Linux (Red Hat 7.3), PGI Workstation 4.0 (Portland Group Fortran and C++), HRM DWD (hydrostatic High-Resolution Model), 93 x 73 grid points, 31 layers, 0.125° grid spacing (14 km), forecast for 48 hours

    AMD Duron 1300 MHz, 384 MB PC133 SDRAM: 96 min
    AMD Athlon XP 1800+, 256 MB DDR266 RAM: 81 min
    Pentium 4 2.4 GHz, 512 MB DDR333 SDRAM: 70 min
    Intel Xeon workstation, 1 processor 2.4 GHz, 2048 MB RDRAM PC800: 60 min
    Intel Xeon workstation, 2 processors 2.4 GHz, 2048 MB RDRAM PC800: 33 min

    Results of experiments by V. Galabov (Bulgaria) with different PCs
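    A small derived number from the figures above (my arithmetic): going from one to two Xeon processors reduces the run time from 60 min to 33 min, i.e.

    \[
    S = \frac{60}{33} \approx 1.8, \qquad E = \frac{S}{2} \approx 91\%,
    \]

    a parallel efficiency of about 91% for this 48-hour HRM forecast.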

  • program TestOMP
      ! Simple OpenMP benchmark: queries the runtime environment and times a
      ! parallel double loop (the array d is deliberately shared, as in the
      ! original benchmark).
      use omp_lib
      implicit none
      integer :: k, n, tid, nthreads, max_threads, procs
      integer :: hrs1, mins1, secs1, hsecs1, hrs2, mins2, secs2, hsecs2
      integer :: year, month, day
      logical :: dynamic, nested
      double precision :: d(5000)

      call gettim (hrs1, mins1, secs1, hsecs1)   ! compiler-specific timing routine
      call getdat (year, month, day)

      max_threads = OMP_GET_MAX_THREADS()
      procs       = OMP_GET_NUM_PROCS()
      dynamic     = OMP_GET_DYNAMIC()
      nested      = OMP_GET_NESTED()

!$OMP PARALLEL PRIVATE (nthreads, tid, n, k)
      tid      = OMP_GET_THREAD_NUM()
      nthreads = OMP_GET_NUM_THREADS()
!$OMP DO SCHEDULE (STATIC, 5000)
      do n = 1, 10000
         do k = 1, 5000
            d(k) = sin (dble(k+n))**2 + cos (dble(k+n))**2
         end do
      end do
!$OMP END DO
!$OMP END PARALLEL

      call gettim (hrs2, mins2, secs2, hsecs2)
      call getdat (year, month, day)
    end program TestOMP

  • Pentium 4 3.06 GHz; 2 GB DDR DIMM PC3200; 120 GB Seagate

    OS                     BIOS              Compiler            OpenMP   Time
    Windows XP             Threads disabled  Visual Fortran 6.5    -      3.59 s
    Windows XP             HyperThreading    Visual Fortran 6.5    -      3.63 s
    Linux (Mandrake 9.2)   Threads disabled  Intel Fortran 8.0    + & -   3.59 s
    Linux (Mandrake 9.2)   HyperThreading    Intel Fortran 8.0     +      2.38 s
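    Reading the table (my arithmetic): enabling HyperThreading together with OpenMP under Linux/Intel Fortran shortens the run from 3.59 s to 2.38 s,

    \[
    \frac{3.59}{2.38} \approx 1.5,
    \]

    i.e. roughly a 1.5x speedup on this single-CPU Pentium 4, whereas under Windows without OpenMP HyperThreading gives no gain (3.59 s vs. 3.63 s).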

  • The future (from E. Kalnay)

    An amazing improvement in the quality of the forecasts based on NWP guidance. From the active research currently taking place, one can envision that the next decade will continue to bring improvements, especially in the following areas:

    Detailed short-range forecasts, using storm-scale models able to provide skillful predictions of severe weather.
    More sophisticated methods of data assimilation able to extract the maximum possible information from observing systems, especially remote sensors such as satellites and radars.
    Development of adaptive observing systems, where additional observations are placed where ensembles indicate that there is rapid error growth (low predictability).
    Improvement in the usefulness of medium-range forecasts, especially through the use of ensemble forecasting.
    Fully coupled atmospheric-hydrological systems, where the atmospheric model precipitation is appropriately downscaled and used to extend the length of river flow prediction.
    More use of detailed atmosphere-ocean-land coupled models, where the effect of long-lasting coupled anomalies such as SST and soil moisture anomalies leads to more skillful predictions of anomalies in weather patterns beyond the limit of weather predictability (about two weeks).
    More guidance to government and the public in areas such as air pollution, UV radiation and transport of contaminants, which affect health.
    An explosive growth of systems with emphasis on commercial applications of NWP, from guidance on the state of highways to air pollution, flood prediction, guidance to agriculture, construction, etc.

  • COMPONENTS OF A FORECAST SYSTEM

    Observing system

    Telecommunication system

    Computer system

    Data assimilation

    Model

    Postprocessing