
Coordination of Common Modeling Infrastructure

Cecelia DeLuca, WGCM/WMP Meeting, Exeter, UK, [email protected], Oct 6, 2005

Climate | Data Assimilation | Weather

Outline

• What is ESMF?
• How Do ESMF and PRISM Differ?
• Why Do ESMF and PRISM Differ?
• Can ESMF and PRISM Be Usefully Combined?
• Model Metadata and Earth System Curator
• How Can WMP Help?

ESMF Background

Three linked proposals were funded by NASA ESTO in 2002:

1. Core framework (Killeen/NCAR)

2. Modeling applications (Marshall/MIT)

3. Data assimilation applications (da Silva/NASA GSFC)

Original ESMF applications:
• NOAA GFDL atmospheres
• NOAA GFDL MOM4 ocean
• NOAA NCEP atmosphere, analyses
• NASA GMAO models and GEOS-5
• NASA/COLA Poseidon ocean
• LANL POP ocean and CICE
• NCAR WRF
• NCAR CCSM
• MITgcm atmosphere and ocean

ESMF grew out of the now-defunct Common Modeling Infrastructure Working Group, which involved many operational and research centers in the U.S. (Steve Zebiak and Robert Dickinson, chairs).

New ESMF-Based Programs: Funding for Science, Adoption, and Core Development

Modeling, Analysis and Prediction Program for Climate Variability and Change
Sponsor: NASA
Partners: University of Colorado at Boulder, University of Maryland, Duke University, NASA Goddard Space Flight Center, NASA Langley, NASA Jet Propulsion Laboratory, Georgia Institute of Technology, Portland State University, University of North Dakota, Johns Hopkins University, Goddard Institute for Space Studies, University of Wisconsin, Harvard University, and more
The NASA Modeling, Analysis and Prediction Program will develop an ESMF-based modeling and analysis environment to study climate variability and change.

Battlespace Environments Institute
Sponsor: Department of Defense
Partners: DoD Naval Research Laboratory, DoD Fleet Numerical, DoD Army ERDC, DoD Air Force Weather Agency
The Battlespace Environments Institute is developing integrated Earth and space forecasting systems that use ESMF as a standard for component coupling.

Integrated Dynamics through Earth’s Atmosphere and Space Weather Initiatives
Sponsors: NASA, NSF
Partners: University of Michigan/SWMF, Boston University/CISM, University of Maryland, NASA Goddard Space Flight Center, NOAA CIRES
ESMF developers are working with the University of Michigan and others to develop the capability to couple together Earth and space software components.

Spanning the Gap Between Models and Datasets: Earth System Curator
Sponsor: NSF
Partners: Princeton University, Georgia Institute of Technology, Massachusetts Institute of Technology, PCMDI, NOAA GFDL, NOAA PMEL, DOE ESG
The ESMF team is working with data specialists to create an end-to-end knowledge environment that encompasses data services and models.

What is ESMF?

• ESMF provides tools for turning model codes into components with standard interfaces and standard drivers (a sketch of the standard interface follows below).
• ESMF provides data structures and common utilities that components use for routine services such as data communications, regridding, time management, configuration, and message logging.
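Below is a minimal sketch of that standard interface, written against the modern ESMF Fortran API (module and constant names differ in detail from the 2005-era releases); the component and routine names (my_model_mod, my_init, my_run, my_final) are hypothetical.

    module my_model_mod
      use ESMF
      implicit none
      public :: SetServices
    contains

      ! Registration: tells the framework which user routines implement
      ! this component's standard Initialize/Run/Finalize methods.
      subroutine SetServices(gcomp, rc)
        type(ESMF_GridComp) :: gcomp
        integer, intent(out) :: rc
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, my_init, rc=rc)
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN, my_run, rc=rc)
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_FINALIZE, my_final, rc=rc)
      end subroutine SetServices

      ! Every method shares the same standard signature; data enters and
      ! leaves only through the import and export States.
      subroutine my_init(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp) :: gcomp
        type(ESMF_State) :: importState, exportState
        type(ESMF_Clock) :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS   ! ... set up model state, populate export State ...
      end subroutine my_init

      subroutine my_run(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp) :: gcomp
        type(ESMF_State) :: importState, exportState
        type(ESMF_Clock) :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS   ! ... advance the model one coupling interval ...
      end subroutine my_run

      subroutine my_final(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp) :: gcomp
        type(ESMF_State) :: importState, exportState
        type(ESMF_Clock) :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS   ! ... clean up ...
      end subroutine my_final
    end module my_model_mod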

[Figure: ESMF architecture layers, top to bottom]
ESMF Superstructure: AppDriver; Component Classes (GridComp, CplComp, State)
User Code
ESMF Infrastructure: Data Classes (Bundle, Field, Grid, Array); Utility Classes (Clock, LogErr, DELayout, Machine)
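As a taste of the infrastructure layer, here is a hedged sketch of the Clock utility driving an application time loop, again using the modern ESMF Fortran API; the clock name "appClock" and the 6-hour step are illustrative.

    program clock_demo
      use ESMF
      implicit none
      type(ESMF_Clock)        :: clock
      type(ESMF_Time)         :: startTime, stopTime
      type(ESMF_TimeInterval) :: step
      integer :: rc

      call ESMF_Initialize(defaultCalKind=ESMF_CALKIND_GREGORIAN, rc=rc)

      call ESMF_TimeSet(startTime, yy=2005, mm=10, dd=6, rc=rc)
      call ESMF_TimeSet(stopTime,  yy=2005, mm=10, dd=7, rc=rc)
      call ESMF_TimeIntervalSet(step, h=6, rc=rc)   ! 6-hour coupling step
      clock = ESMF_ClockCreate(timeStep=step, startTime=startTime, &
              stopTime=stopTime, name="appClock", rc=rc)

      ! Each pass through this loop would invoke the components' Run methods.
      do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
        call ESMF_ClockAdvance(clock, rc=rc)
      end do

      call ESMF_Finalize(rc=rc)
    end program clock_demo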

Outputs and outcomes …
• Open-source, collaboratively developed software utilities and coupling interfaces, exhaustive test suite, documentation, support and training.
• A federation of geophysical components that can be assembled in multiple ways, using different drivers and different couplers.
• An Earth science organization that has focused interactions at many levels: software engineer and support scientist, technical and scientific manager, scientist, director, sponsor.
• An extended community with strong connections and many diverse science options.

ESMF Components and Couplers
Application Example: GEOS-5 AGCM

[Figure: GEOS-5 AGCM component hierarchy, a tree of user-written ESMF components including agcm, history, dynamics, fvcore, gravity_wave_drag, physics, chemistry, moist_processes, radiation, infrared, solar, turbulence, surface, lake, land_ice, data_ocean, land, vegetation, and catchment, with a coupler connecting each level of the tree.]

• Each box is a user-written ESMF component
• Every component has a standard interface so that it is (technically) swappable
• Data in and out of components are packaged as state types with user-defined fields
• New components can easily be added to the hierarchical system
• Many different structures can be assembled by switching the tree around (a sketch of this assembly pattern follows below)

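To make the hierarchy concrete, here is a hedged sketch of how a parent component creates and registers two children, assuming the modern ESMF Fortran API; the child modules dynamics_mod and physics_mod are hypothetical placeholders, not the actual GEOS-5 code.

    subroutine parent_init(gcomp, importState, exportState, clock, rc)
      use ESMF
      use dynamics_mod, only : dyn_SetServices => SetServices   ! hypothetical
      use physics_mod,  only : phy_SetServices => SetServices   ! hypothetical
      type(ESMF_GridComp) :: gcomp
      type(ESMF_State)    :: importState, exportState
      type(ESMF_Clock)    :: clock
      integer, intent(out) :: rc
      type(ESMF_GridComp) :: dynamics, physics

      ! The parent creates its children and registers their standard
      ! methods; because every child presents the same interface, a
      ! component can be swapped by changing only these lines.
      dynamics = ESMF_GridCompCreate(name="dynamics", rc=rc)
      physics  = ESMF_GridCompCreate(name="physics",  rc=rc)
      call ESMF_GridCompSetServices(dynamics, userRoutine=dyn_SetServices, rc=rc)
      call ESMF_GridCompSetServices(physics,  userRoutine=phy_SetServices, rc=rc)
      ! A coupler between the children would be created analogously with
      ! ESMF_CplCompCreate, and the children's Initialize methods called here.
    end subroutine parent_init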

But!

• It is possible to do a “wrap” of an existing model with ESMF, without needing to change internal data structures, by just creating one Component box
• This is generally lightweight in terms of performance
• Users can choose to use all of ESMF or just some of it
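A hedged sketch of such a wrap, assuming the modern ESMF Fortran API: the Run entry point is a thin shim around the existing model's own top-level step routine, and the model's internal data structures are untouched. legacy_model and legacy_step are hypothetical stand-ins for the unmodified code.

    subroutine wrap_run(gcomp, importState, exportState, clock, rc)
      use ESMF
      use legacy_model, only : legacy_step   ! the unmodified model code
      type(ESMF_GridComp) :: gcomp
      type(ESMF_State)    :: importState, exportState
      type(ESMF_Clock)    :: clock
      integer, intent(out) :: rc
      rc = ESMF_SUCCESS
      call legacy_step()   ! model advances using its own internal state
    end subroutine wrap_run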

• Measures overhead of ESMF superstructure in NCEP Spectral Statistical Interpolation (SSI) analysis system, ~1% overall

• Run on NCAR IBM

• Runs done by JPL staff, confirmed by NCEP developers

ESMF Development Status
• Concurrent or sequential execution, single or multiple executable
• Support for configuring ensembles
• Logically rectangular grids with regular and arbitrary distributions can be represented, and regular distributions can be regridded
• On-line parallel regridding (bilinear, 1st-order conservative) implemented and optimized (sketched below)
• Other parallel methods, e.g. halo, redistribution, and low-level communications, implemented
• Utilities such as time manager, logging, and configuration manager usable and adding features
• Fortran interfaces and complete documentation, some C++ interfaces
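For illustration, a hedged sketch of the on-line parallel regridding, written against the modern ESMF Fortran API (the 2005-era interface differed in detail); srcField and dstField are assumed to be ESMF_Fields already created on their respective grids.

    subroutine couple_fields(srcField, dstField, rc)
      use ESMF
      type(ESMF_Field)       :: srcField, dstField
      integer, intent(out)   :: rc
      type(ESMF_RouteHandle) :: rh

      ! Compute the parallel interpolation weights once...
      call ESMF_FieldRegridStore(srcField=srcField, dstField=dstField, &
           regridmethod=ESMF_REGRIDMETHOD_BILINEAR, routehandle=rh, rc=rc)
      ! ...apply them (typically once per coupling step)...
      call ESMF_FieldRegrid(srcField, dstField, routehandle=rh, rc=rc)
      ! ...and release the weights when coupling is finished.
      call ESMF_FieldRegridRelease(rh, rc=rc)
    end subroutine couple_fields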

ESMF software is not yet a hardened, out-of-the-box solution

ESMF Platform Support
• IBM AIX (32 and 64 bit addressing)
• SGI IRIX64 (32 and 64 bit addressing)
• SGI Altix (64 bit addressing)
• Cray X1 (64 bit addressing)
• Compaq OSF1 (64 bit addressing)
• Linux Intel (32 and 64 bit addressing, with mpich and lam)
• Linux PGI (32 and 64 bit addressing, with mpich)
• Linux NAG (32 bit addressing, with mpich)
• Linux Absoft (32 bit addressing, with mpich)
• Linux Lahey (32 bit addressing, with mpich)
• Mac OS X with xlf (32 bit addressing, with lam)
• Mac OS X with Absoft (32 bit addressing, with lam)
• Mac OS X with NAG (32 bit addressing, with lam)

• User-contributed g95 support

Current Challenges

Refocus core development team
• Base infrastructure is complete – now need support for unstructured grids, multi-block grids with complex boundary behavior (e.g. tripole, cubed sphere), more regridding options, and constructs for data assimilation
• Team composition must change correspondingly
• Better, smarter testing – suite of 1600 unit tests, 15 system tests, 30+ examples still needs supplements
• Major increase in demand for customer support and training

Many new requirements
• Commercial tool for tracking requirements (DOORS)
• New representative body for prioritizing development tasks (Change Review Board)

Organizationally and technically, ESMF infrastructure will take another 3-5 years to mature

ESMF v PRISM

[Figure: side-by-side architecture comparison]
ESMF (top to bottom): Superstructure (AppDriver; Component Classes: GridComp, CplComp, State) / User Code / Infrastructure (Data Classes: Bundle, Field, Grid, Array; Utility Classes: Clock, LogErr, DELayout, Machine)
PRISM (top to bottom): Coupling Superstructure / User Code / Utility Infrastructure / Run-time environment

Other Differences …

PRISM
• Components are generally in separate executables
• Components are generally not nested
• Single coupler
• Data is transferred through put/get
• Data can go from anywhere to anywhere in another component
[Figure: several components communicating through a single central Coupler]

ESMF
• Components are generally in the same executable
• Components are often nested
• Multiple couplers
• Data is passed through states at the beginning and end of method execution
[Figure: a Seasonal Forecast application assembled as a hierarchy of nested components (assim, atm, land, ocean, sea ice, assim_atm), with couplers at multiple levels]
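The contrast can be seen in code. Below is a hedged sketch of the ESMF side, assuming the modern ESMF Fortran API: data crosses component boundaries only through States at method boundaries, whereas a PRISM model instead calls put/get on the coupler from inside its own time loop. The field name "sst" is illustrative.

    subroutine cpl_run(cplcomp, importState, exportState, clock, rc)
      use ESMF
      type(ESMF_CplComp) :: cplcomp
      type(ESMF_State)   :: importState, exportState
      type(ESMF_Clock)   :: clock
      integer, intent(out) :: rc
      type(ESMF_Field) :: src, dst
      type(ESMF_RouteHandle), save :: rh   ! precomputed at Initialize (not shown)

      ! importState was filled by the producer's Run method before this
      ! call; exportState will be read by the consumer's Run method after.
      call ESMF_StateGet(importState, itemName="sst", field=src, rc=rc)
      call ESMF_StateGet(exportState, itemName="sst", field=dst, rc=rc)
      call ESMF_FieldRegrid(src, dst, routehandle=rh, rc=rc)
    end subroutine cpl_run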

Motivation for Common Modeling Infrastructure

• Support for modeling workflows (e.g. job submission, version control, annotation and archival of experiments, integration with visualization and analysis tools)

• Model intercomparison and interchange of model components
• Better utilization of compute resources and performance optimization
• Cost effectiveness: shared, fully featured common utilities (e.g. logging, timing, regridding, calendars, I/O, parallelization tools)
• Systematic internal architecture of multi-component models, support for many different drivers and configurations

Why Do ESMF and PRISM Differ?
For both ESMF and PRISM, overall design was decided by a large group of experienced modelers… so how did the two efforts wind up with such different solutions?

• PRISM single-driver approach leads to greater effective interoperability for a constrained (climate) domain

• ESMF approach leads to limited interoperability for a broader set of domains: climate, weather, space weather, data assimilation – support for seamless prediction

Both ESMF and PRISM face similar requirements – but have taken different paths to fulfill them

Can ESMF and PRISM be Usefully Combined?
• ESMF can use PRISM run-time elements
• PRISM can use the ESMF utility layer
• ESMF can offer a put/get paradigm for greater flexibility
• ESMF components can be described using PRISM PMIOD files (XML description of model inputs/outputs and content), and ESMF data transfers expressed as PRISM put/gets, so that the same component can run in both systems (done with MOM4)

Model Metadata and Earth System Curator
Earth System Curator takes the interaction of ESMF/PRISM a step further:
• Recognize models and datasets are described by similar metadata
• Develop standards for model metadata, especially in the area of grids
• Work with umbrella groups developing metadata standards (e.g. GO-ESSP) to integrate model and data metadata
• Work with groups developing ontologies (LEAD, ESML) to invest metadata standards with structure and flexibility
• Work with GFDL, CCSM and PCMDI to link databases that store models, experiments, and data to serve MIPs and IPCC
Anticipated result:
• Coordinated growth of ESMF and PRISM
• Opportunities to develop smarter tools (e.g. compatibility, assembly) based on metadata information

How Can WMP Help?
• Support and promote common modeling infrastructure
  ◦ Maintain a science-driven methodology
  ◦ Emphasize long-term investment and continuity
  ◦ Communicate expectations – the “plug and play” myth
• Support and promote efforts to generate metadata standards and ontologies
  ◦ For the interaction of ESMF and PRISM
  ◦ For the development of a more comprehensive and useful modeling environment
• Help determine how to utilize infrastructure as an entry point into the broader (international) modeling community