TRANSCRIPT
NSF NCAR / NASA GSFC / DOE LANL ANL / NOAA NCEP GFDL / MIT / U MICH
C. DeLuca/NCAR, J. Anderson/NCAR, V. Balaji/GFDL, B. Boville/NCAR, N. Collins/NCAR, T. Craig/NCAR, C. Cruz/GSFC, A. da Silva/GSFC, R. Hallberg/GFDL, C. Hill/MIT, M. Iredell/NCEP, R. Jacob/ANL, P. Jones/LANL, B. Kauffman/NCAR, J. Larson/ANL, J. Michalakes/NCAR, E. Schwab/NCAR, S. Smithline/GFDL, Q. Stout/U Mich, M. Suarez/GSFC, A. Trayanov/GSFC, S. Vasquez/NCAR, J. Wolfe/NCAR, W. Yang/NCEP, M. Young/NCEP and L. Zaslavsky/GSFC
Introduction to the Earth System Modeling Framework
[Title graphic: ESMF spanning climate, weather, and data assimilation applications, including the NSIPP Seasonal Forecast, NCAR/LANL CCSM, NCEP Forecast, GFDL FMS Suite, MITgcm, and NASA GSFC PSAS]
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Background
NASA’s Earth Science Technology Office proposed the creation of an Earth System Modeling Framework (ESMF) in the September 2000 NASA Cooperative Agreement Notice:
“Increasing Interoperability and Performance of Grand Challenge Applications in the Earth, Space, Life and Microgravity Sciences”
A large, interagency collaboration with roots in the Common Modeling Infrastructure Working Group proposed three interlinked projects to develop and deploy the ESMF, which were all funded:
Part I: Core ESMF Development (PI: Killeen, NCAR)
Part II: Modeling Applications (PI: Marshall, MIT)
Part III: Data Assimilation Applications (PI: da Silva, NASA GMAO)
Motivation
In climate research and NWP... increased emphasis on detailed representation of individual physical processes; requires many teams of specialists to contribute components to an overall modeling system
In computing technology... increase in hardware and software complexity in high-performance computing, as we shift toward the use of scalable computing architectures
In software... development of frameworks, such as FMS, GEMS, CCA and WRF, that encourage software reuse and interoperability
The ESMF is a focused community effort to tame the complexity of models and the computing environment. It leverages, unifies and extends existing software frameworks, creating new opportunities for scientific contribution and collaboration.
ESMF Project Description

GOALS: To increase software reuse, interoperability, ease of use and performance portability in climate, weather, and data assimilation applications

PRODUCTS:
• Core framework: software for coupling geophysical components and utilities for building components
• Applications: deployment of the ESMF in 15 of the nation's leading climate and weather models, assembly of 8 new science-motivated applications

METRICS:
• Reuse: 15 applications use ESMF component coupling services and 3+ utilities
• Interoperability: 8 new applications comprised of never-before coupled components
• Ease of Adoption: 2 codes adopt ESMF with < 2% lines of code changed, or within 120 FTE-hours
• Performance: no more than 10% overhead in time to solution, no degradation in scaling

RESOURCES and TIMELINE: $9.8M over 3 years
ESMF Accelerates Advances in Earth System Science
Eliminates software barriers to collaboration among organizations
• Easy exchange of model components accelerates progress in NWP and climate modeling
• Independently developed models and data assimilation methods can be combined and tested
• Coupled model development becomes a truly distributed process
• Advances from smaller academic groups easily adopted by large modeling centers

Facilitates development of new interdisciplinary collaborations
• Simplifies extension of climate models to upper atmosphere
• Accelerates inclusion of advanced biogeochemical components into climate models
• Develops clear path for many other communities to use, improve, and extend climate models
• Many new model components gain easy access to power of data assimilation
ESMF Collaborators
NSF NCAR: Tim Killeen (PI), Jeff Anderson, Byron Boville, Nancy Collins, Cecelia DeLuca, Roberta Johnson, Al Kellie, John Michalakes, David Neckels, Earl Schwab, Robbie Staufer, Silverio Vasquez, Jon Wolfe
DOE ANL: Rob Jacob, Jay Larson
NOAA GFDL: Ants Leetmaa, V. Balaji, Robert Hallberg, Shep Smithline
NASA GMAO: Arlindo da Silva (PI), Michele Rienecker, Max Suarez, Atanas Trayanov, Christian Keppenne, Leonid Zaslavsky, Will Sawyer, Carlos Cruz
NOAA NCEP: Stephen Lord, Mark Iredell, Mike Young, Weiyu Yang, John Derber
MIT: John Marshall (PI), Chris Hill
DOE LANL: Phil Jones
University of Michigan: Quentin Stout
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Modeling Applications
GFDL:
• FMS B-grid atmosphere at N45L18
• FMS spectral atmosphere at T63L18
• FMS MOM4 ocean model at 2°x2°L40
• FMS HIM isopycnal C-language ocean model at 1/6°x1/6°L22

MIT:
• MITgcm coupled atmosphere/ocean at 2.8°x2.8°, atmosphere L5, ocean L15
• MITgcm regional and global ocean at 15km L30

GMAO/NSIPP:
• NSIPP atmospheric GCM at 2°x2.5°L34 coupled with NSIPP ocean GCM at 2/3°x1.25°L20

NCAR/LANL:
• CCSM2 including CAM with Eulerian spectral dynamics and CLM at T42L26, coupled with POP ocean and data ice model at 1°x1°L40
Data Assimilation Applications
GMAO/DAO:
• PSAS interface layer with 200K observations/day
• CAM with finite volume dynamics at 2°x2.5°L55, including CLM

NCEP:
• Global atmospheric spectral model at T170L42
• SSI analysis system with 250K observations/day, 2 tracers
• WRF regional atmospheric model at 22km resolution, CONUS forecast, 345x569L50

GMAO/NSIPP:
• ODAS with OI analysis system at 1.25°x1.25°L20 resolution with ~10K observations/day

MIT:
• MITgcm 2.8° century / millennium adjoint sensitivity
Interoperability Experiments: 8 New Applications
COUPLED CONFIGURATION: NEW SCIENCE ENABLED
• GFDL B-grid atm / MITgcm ocn: Global biogeochemistry (CO2, O2), seasonal-to-interannual (SI) timescales.
• GFDL MOM4 / NCEP forecast: NCEP seasonal forecasting system.
• NSIPP ocean / LANL CICE: Sea ice model for extension of SI system to centennial time scales.
• NSIPP atm / DAO analysis: Assimilated initial state for SI.
• DAO analysis / NCEP model: Intercomparison of systems for NASA/NOAA joint center for satellite data assimilation.
• DAO CAM-fv / NCEP analysis: Intercomparison of systems for NASA/NOAA joint center for satellite data assimilation.
• NCAR CAM Eul / MITgcm ocn: Improved climate predictive capability (climate sensitivity to large component interchange, optimized initial conditions).
• NCEP WRF / GFDL MOM4: Development of hurricane prediction capability.
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Related Projects
• PRISM is an ongoing European Earth system modeling infrastructure project
• Involves current state-of-the-art atmosphere, ocean, sea-ice, atmospheric chemistry, land-surface and ocean-biogeochemistry models
• 22 partners: leading climate researchers and computer vendors, including MPI, KNMI, UK Met Office, CERFACS, ECMWF, DMI
• ESMF is working with PRISM to merge frameworks and develop common conventions
• CCA is creating a minimal interface and sets of tools for linking high performance components. CCA can be used to implement frameworks and standards developed in specific domains (such as ESMF).
• Collaborators include LANL, ANL, LLNL, ORNL, Sandia, University of Tennessee, and many more. Ongoing ESMF collaboration with CCA/LANL on language interoperability.
• Working prototype demonstrating CCA/ESMF interoperability, to be presented at SC2003.
For joint use with PRISM, ESMF developed a component database to store component import/export fields and component descriptions.
ESMF Connections
[Diagram: ESMF connections to CCA, WRF, SciDAC, CCSM, GEMS, FMS, SWMF, and PRISM, with points of contact including Larson/ANL, DeLuca/NCAR-SCD, Jones/LANL, Stout/U Mich, Killeen/NCAR, Drake/ORNL, Boville/NCAR-CGD, Michalakes/NCAR-MMM, Suarez/NASA Goddard, and Balaji/NOAA GFDL]
ESMF: Earth System Modeling Framework
CCA: DOE Common Component Architecture
SciDAC: DOE/NSF CCSM SciDAC Project
GEMS: Goddard Earth Modeling System
FMS: GFDL Flexible Modeling System
SWMF: Space Weather Modeling Framework
WRF: Weather Research and Forecast Model
CCSM: Community Climate System Model
PRISM: Program for Int. Earth System Modeling
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Computational Characteristics of Weather/Climate
• Mix of global transforms and local communications
• Load balancing for diurnal cycle, event (e.g. storm) tracking
• Applications typically require 10s of GFLOPS, 100s of PEs – but can go to 10s of TFLOPS, 1000s of PEs
• Required Unix/Linux platforms span laptop to Earth Simulator
• Multi-component applications: component hierarchies, ensembles, and exchanges
• Data and grid transformations between components
• Applications may be MPMD/SPMD, concurrent/sequential, combinations
• Parallelization via MPI, OpenMP, shmem, combinations
• Large applications (typically 100,000+ lines of source code)
[Diagram: a Seasonal Forecast application as a hierarchy of components running across multiple platforms - a coupler joins assim_atm (assim + atm), land, ocean, and sea ice, with atm subdivided into physics and dycore]
Architecture
[Diagram: layered architecture, top to bottom]
• ESMF Superstructure: Components Layer (Gridded Components, Coupler Components)
• User Code: Model Layer
• ESMF Infrastructure: Fields and Grids Layer, Low Level Utilities
• External Libraries: BLAS, MPI, NetCDF, …
1. ESMF provides an environment for assembling geophysical components into applications.
2. ESMF provides a toolkit that components use to
i. increase interoperability
ii. improve performance portability
iii. abstract common services
Becoming an ESMF Component
• Pack model import and export data into ESMF data structures and conform to a standard calendar. Use ESMF utilities internally as desired. Organize the model using standard ESMF methods: Initialize, Run, Finalize, ReadRestart, WriteRestart. Methods may be multi-phase (Run phase=1, Run phase=2). Method interfaces are prescribed.
• Instantiate an ESMF Component with name, type, and config information. Register the standard model methods with the Component. If desired, register data.
• Use the ESMF AppDriver to sequence and run Components.
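Below is a minimal Fortran sketch of the registration step described above, assuming a hypothetical user module (my_atm_mod) and routine names (my_init, my_run, my_final). The entry-point calls (ESMF_GridCompSetEntryPoint, ESMF_METHOD_*) and the ESMF module name follow the current public ESMF API, which postdates the prototype interfaces shown in these slides.

module my_atm_mod                      ! hypothetical user component module
  use ESMF                             ! current ESMF module name (assumption for this sketch)
  implicit none
  public :: SetServices
contains

  subroutine SetServices(gcomp, rc)
    type(ESMF_GridComp)  :: gcomp
    integer, intent(out) :: rc
    ! Register the standard methods named above: Initialize, Run, Finalize.
    ! Multi-phase methods would repeat a call with phase=1, phase=2, etc.
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, userRoutine=my_init,  rc=rc)
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN,        userRoutine=my_run,   rc=rc)
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_FINALIZE,   userRoutine=my_final, rc=rc)
  end subroutine SetServices

  ! All user methods share the prescribed interface:
  ! (component, import State, export State, clock, return code).
  subroutine my_init(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    ! Create grids and fields; pack export data into exportState.
    rc = ESMF_SUCCESS
  end subroutine my_init

  subroutine my_run(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    ! Advance the model over one coupling interval: unpack forcing from
    ! importState, step the model, refresh exportState.
    rc = ESMF_SUCCESS
  end subroutine my_run

  subroutine my_final(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    ! Release resources and write restarts if needed.
    rc = ESMF_SUCCESS
  end subroutine my_final

end module my_atm_mod

An AppDriver, or a parent component as shown later, would then create the Component, attach this SetServices routine, and sequence Initialize, Run, and Finalize.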
Design Features
ESMF enables modeling applications to be:
• Scalable: models are built from modular components, and can be easily nested within larger applications
• Performance-portable: ESMF high-performance communication libraries offer a consistent interface across computer architectures
• Exchangeable: standard component interfaces enable interoperability
ESMF Class Structure
[Class diagram: Superstructure classes are F90; Infrastructure classes span F90 and C++ and include the data communications]

Superstructure:
• GridComp: Land, ocean, atm, … model
• CplComp: Xfers between GridComps
• State: Data imported or exported

Infrastructure:
• Bundle: Collection of fields
• Field: Physical field, e.g. pressure
• Grid: LogRect, Unstruct, etc.
• DistGrid: Grid decomposition
• PhysGrid: Math description
• Array: Hybrid F90/C++ arrays
• Regrid: Computes interp weights
• Route: Stores comm paths
• DELayout: Communications
• Utilities: Machine, TimeMgr, LogErr, I/O, Config, Base etc.
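To make the data classes above concrete, here is a minimal sketch that builds a Grid, creates a Field on it, and publishes the Field through a State. The field name ("sst") and grid size are made up for illustration, and the calls (ESMF_GridCreateNoPeriDim, ESMF_FieldCreate, ESMF_StateCreate, ESMF_StateAdd) follow the current public ESMF API rather than the 2003-era class interfaces.

program class_sketch
  use ESMF
  implicit none
  integer          :: rc
  type(ESMF_Grid)  :: grid
  type(ESMF_Field) :: sst
  type(ESMF_State) :: exportState

  call ESMF_Initialize(rc=rc)

  ! A 360 x 180 logically rectangular Grid; its decomposition across DEs is
  ! described by the associated DistGrid and DELayout.
  grid = ESMF_GridCreateNoPeriDim(maxIndex=(/360,180/), rc=rc)

  ! A Field associates a physical quantity (here a hypothetical "sst") with the Grid.
  sst = ESMF_FieldCreate(grid, typekind=ESMF_TYPEKIND_R8, name="sst", rc=rc)

  ! A State carries the data a Component imports or exports across its interface.
  exportState = ESMF_StateCreate(name="ocn export", rc=rc)
  call ESMF_StateAdd(exportState, fieldList=(/sst/), rc=rc)

  call ESMF_Finalize(rc=rc)
end program class_sketch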
Design Principle: Local Communication
All inter-Component communication within ESMF is local - all communication is handled within Components. This allows the architecture of the framework to be independent of the communication strategy.
[Diagram: components mapped onto PEs - climate_comp containing atm_comp (atm_phys, atm_dyn, phys2dyn_coupler), ocn_comp, and the atm2ocn_coupler]

As a consequence, Coupler Components must be defined on the union of the PEs of all the Components that they couple.
In this example, in order to send data from the atmosphere Component to the ocean, the atm2ocn Coupler mediates the send.
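As a rough illustration of this mediated send, here is a sketch of the atm2ocn Coupler Component's Run routine. The module and field names are hypothetical, and the calls (ESMF_StateGet, ESMF_FieldRegrid, ESMF_CplCompSetEntryPoint) use the current public ESMF API; the 2003-era coupler interfaces may have differed.

module atm2ocn_coupler_mod                ! hypothetical coupler module
  use ESMF
  implicit none
  ! Regrid route, precomputed with ESMF_FieldRegridStore in the coupler's
  ! Initialize method (not shown).
  type(ESMF_RouteHandle), save :: rh
contains

  ! Registered in the coupler's SetServices via ESMF_CplCompSetEntryPoint.
  subroutine atm2ocn_run(cplcomp, importState, exportState, clock, rc)
    type(ESMF_CplComp)   :: cplcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    type(ESMF_Field)     :: atmField, ocnField

    ! importState carries the atmosphere's export fields; exportState is the
    ! ocean's import State. Because the coupler runs on the union of both
    ! Components' PEs, it can see and move data on both decompositions.
    call ESMF_StateGet(importState, itemName="surface_heat_flux", field=atmField, rc=rc)
    call ESMF_StateGet(exportState, itemName="surface_heat_flux", field=ocnField, rc=rc)

    ! Interpolate/redistribute from the atm grid and layout to the ocn grid and layout.
    call ESMF_FieldRegrid(atmField, ocnField, routehandle=rh, rc=rc)
  end subroutine atm2ocn_run

end module atm2ocn_coupler_mod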
Design Principle: Scalable Applications
Since each ESMF application is also a Component, entire ESMF applications can be treated as Gridded Components and nested within larger applications.
[Diagram: the same layout - climate_comp containing atm_comp (atm_phys, atm_dyn, phys2dyn_coupler), ocn_comp, and the ocn2atm_coupler, mapped onto PEs]
Example: an atmospheric application, itself composed of multiple Components, may be run standalone or nested within a larger climate application.
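A minimal sketch of this nesting, reusing the hypothetical my_atm_mod component from the earlier sketch: the parent "climate" component's Initialize creates the atmosphere as an ordinary child Gridded Component, so exactly the same code could instead be created and driven directly by the top-level AppDriver. Component and State names are illustrative; the creation calls follow the current public ESMF API.

subroutine climate_init(gcomp, importState, exportState, clock, rc)
  use ESMF
  use my_atm_mod, only: atm_SetServices => SetServices   ! child component's public entry point
  implicit none
  type(ESMF_GridComp)  :: gcomp
  type(ESMF_State)     :: importState, exportState
  type(ESMF_Clock)     :: clock
  integer, intent(out) :: rc
  ! In real code these handles would live in the parent's internal state.
  type(ESMF_GridComp), save :: atm
  type(ESMF_State),    save :: atm_import, atm_export

  ! Create the child, attach its registration routine, and initialize it.
  atm        = ESMF_GridCompCreate(name="atm", rc=rc)
  atm_import = ESMF_StateCreate(name="atm import", rc=rc)
  atm_export = ESMF_StateCreate(name="atm export", rc=rc)
  call ESMF_GridCompSetServices(atm, userRoutine=atm_SetServices, rc=rc)
  call ESMF_GridCompInitialize(atm, importState=atm_import, &
                               exportState=atm_export, clock=clock, rc=rc)
end subroutine climate_init

The parent's Run method would similarly call ESMF_GridCompRun on the child each coupling interval.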
Design Principle: Modularity
Gridded Components don’t have access to the internals of other Gridded Components, and don’t store any coupling information. Gridded Components may:
• pass their States to other Components through their argument list, or
• receive user-defined methods through their argument list for transforming and transferring States.
These are called Transforms, and they contain a function pointer and attributes describing frequency and validity criteria. Transforms may modify the data in States, receive States from other Components, send States to other Components, etc.
EX 1: One-way coupling from atm to ocn (atm_xform is a send; ocn_xform a receive)

! From climate comp
call ESMF_CompRun(atm, atm_xform)
call ESMF_CompRun(ocn, ocn_xform)

! From atm comp
call ESMF_StateXform(atm_ex_state, atm_xform)

! From ocn comp
call ESMF_StateXform(ocn_im_state, ocn_xform)

EX 2: Standalone atm (atm_xform is a no-op)

! From atm comp
call ESMF_CompRun(ocn, ocn_xform)
call ESMF_StateXform(atm_ex_state, atm_xform)
Design Principle: Uniform Communication API
The same programming interface is used for shared memory, distributed memory, and combinations thereof.
Machine model = abstraction of machine architecture (num_nodes, num_pes_per_node, etc.)
DE = a decomposition element - may be virtual, thread, MPI process
DELayout = an arrangement of DEs, in which dimensions requiring faster communication may be specified and resources arranged accordingly
DELayout: 4 x 3, ESMF_COMM_SHR in x and ESMF_COMM_MP in y.
The data in a Grid is decomposed according to the number and topology of DEs in the DELayout.
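A small sketch of this decomposition idea, using the 4 x 3 arrangement from the example above. The index-space size is made up, and the call uses the ESMF_DistGridCreate interface of the current public release; the ESMF_COMM_SHR/ESMF_COMM_MP flags above belong to the earlier prototype API.

program delayout_sketch
  use ESMF
  implicit none
  integer             :: rc
  type(ESMF_DistGrid) :: distgrid

  call ESMF_Initialize(rc=rc)

  ! Decompose a 360 x 180 global index space over a 4 x 3 arrangement of DEs;
  ! each DE is mapped to a PET (an MPI process or a thread), and a Grid built
  ! on this DistGrid inherits the decomposition.
  distgrid = ESMF_DistGridCreate(minIndex=(/1,1/), maxIndex=(/360,180/), &
                                 regDecomp=(/4,3/), rc=rc)

  call ESMF_Finalize(rc=rc)
end program delayout_sketch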
[Plot: performance of ESMF AllGather vs. raw MPI]
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Timeline
May 2002: Draft Developer's Guide and Requirements Document completed; 1st Community Requirements Meeting and review held in D.C.
July 2002: ESMF VAlidation (EVA) suite assembled
August 2002: Architecture Document (major classes and their relationships); Implementation Report (language strategy and programming model); Software Build and Test Plan (sequencing and validation)
May 2003: ESMF Version 1.0 release; 2nd Community Meeting at GFDL
November 2003: First 3 interoperability experiments completed
April 2004: Second API and production software release; 3rd Community Meeting
November 2004: All interoperability experiments complete; all testbed applications compliant
January 2005: Final delivery of source code and documentation
More Information
ESMF website: http://www.esmf.ucar.edu
Acknowledgements
The ESMF is sponsored by the NASA Goddard Earth Science Technology Office.