
Transcript of poster: Scalable Coupled Ensemble Data Assimilation with AWI-CM and PDAF (Lars Nerger, Qi Tang, Dmitry Sidorenko)

Strongly Coupled Assimilation
• Joint assimilation update in atmosphere and ocean
• Joint state vector for atmosphere and ocean (distributed by parallelization)
• Utilize cross-covariances between compartments
• Assimilation influences all compartments directly

Weakly Coupled Assimilation
• Separate assimilation updates in atmosphere and ocean
• Separate state vectors for atmosphere and ocean
• No cross-covariances between compartments
• Other compartment only influenced dynamically in the next forecast phase

Scalable Coupled Ensemble Data Assimilation with AWI-CM and PDAF

Lars Nerger, Qi Tang, Dmitry Sidorenko
Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, Bremerhaven, Germany
Contact: [email protected] | http://www.awi.de

Coupled Data Assimilation

We show how to modify a coupled model so that it can be used for efficient ensemble data assimilation. The method uses a direct connection between the coupled model and the ensemble data assimilation framework PDAF [1, http://pdaf.awi.de]. Augmenting the model in this way allows us to set up a data assimilation program with high flexibility and parallel scalability while requiring only small changes to the model code. The direct connection is obtained by
1. adapting the source codes of the coupled model so that they can run an ensemble of model states, and
2. adding a filtering step to the source codes (a schematic sketch follows below).
We discuss this connection for the coupled atmosphere-ocean model AWI-CM. For this coupled model, we have to augment the codes of both the ocean and the atmosphere, adapt the parallelization, and add routines for the handling of observations and model fields specific to each model compartment.
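To make these two steps concrete, here is a minimal, hypothetical Fortran sketch of the augmented program flow. The routine names Init_parallel_PDAF, Init_PDAF and Assimilate_PDAF are those used in the flow diagram of the "Data Assimilation with PDAF" panel below; the empty stub bodies, the argument-free interfaces and the loop length are placeholders so that the sketch compiles standalone, not AWI-CM's actual code.

    program coupled_model_with_pdaf
      implicit none
      integer :: istep
      integer, parameter :: nsteps = 8   ! placeholder number of time steps

      call init_parallel_pdaf()   ! step 1: add the ensemble parallelization
      call initialize_model()     ! unchanged model/coupler/grid/field initialization
      call init_pdaf()            ! step 1 (cont.): allocate and fill the ensemble

      do istep = 1, nsteps        ! ensemble forecast: each task integrates one member
         call model_time_step()   ! unchanged time stepping and coupling
         call assimilate_pdaf()   ! step 2: filter analysis when observations are due
      end do

      call post_processing()

    contains

      ! Empty stubs so the sketch compiles on its own; in AWI-CM these are the
      ! model's existing routines plus the three added PDAF interface routines.
      subroutine init_parallel_pdaf()
      end subroutine
      subroutine initialize_model()
      end subroutine
      subroutine init_pdaf()
      end subroutine
      subroutine model_time_step()
      end subroutine
      subroutine assimilate_pdaf()
      end subroutine
      subroutine post_processing()
      end subroutine

    end program coupled_model_with_pdaf

Only the three PDAF calls are additions; everything else is the unchanged flow of the coupled model.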

Coupled Model: AWI-CM

AWI-CM [2] consists of two separate programs: FESOM for the ocean and ECHAM6 with JSBACH for the atmosphere and land surface. They are coupled with OASIS3-MCT. Fluxes between the models are computed and exchanged every 6 hours by OASIS3-MCT.


Data Assimilation with PDAF

[Flow diagram: program flow of the coupled model with the extension for data assimilation. Model code: Start -> Initialize parallelization -> Initialize Model; initialize coupler, grid & fields -> time loop "Do istep=1, nsteps" -> Post-processing -> Stop. Additions to program flow: Init_parallel_PDAF (add ensemble parallelization) after the model's parallelization is initialized; Init_PDAF (initialize ensemble) after the model initialization; Assimilate_PDAF (perform filter analysis step) inside the time loop, which thereby becomes the parallel ensemble forecast. Source code changes: in OASIS3-MCT replace MPI_COMM_WORLD; add 1 line each in ECHAM and FESOM.]
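The "add ensemble parallelization" and "replace MPI_COMM_WORLD" steps can be pictured with the following minimal, self-contained Fortran/MPI sketch. It only illustrates the general idea of splitting MPI_COMM_WORLD into one communicator per ensemble member; the variable names, the fixed ensemble size and the simple block assignment of processes to tasks are illustrative assumptions, not PDAF's actual Init_parallel_PDAF implementation.

    program split_ensemble_comm
      use mpi
      implicit none
      integer :: ierr, world_rank, world_size
      integer :: n_ens, procs_per_task, task_id, comm_model, model_rank

      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, world_rank, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, world_size, ierr)

      n_ens = 4                              ! hypothetical ensemble size
      procs_per_task = world_size / n_ens    ! assumes world_size is a multiple of n_ens
      task_id = world_rank / procs_per_task  ! block assignment of processes to ensemble tasks

      ! Each ensemble task gets its own communicator for its model instance.
      call MPI_Comm_split(MPI_COMM_WORLD, task_id, world_rank, comm_model, ierr)
      call MPI_Comm_rank(comm_model, model_rank, ierr)

      ! The coupled model would use comm_model wherever it previously used
      ! MPI_COMM_WORLD, so that n_ens model instances run concurrently.
      print *, 'world rank', world_rank, '-> ensemble task', task_id, &
               ', model rank', model_rank

      call MPI_Finalize(ierr)
    end program split_ensemble_comm

In AWI-CM this communicator would be handed to ECHAM, FESOM and OASIS3-MCT in place of MPI_COMM_WORLD, so that the ensemble of coupled model instances can be integrated side by side within a single program run.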

[Figure: grids of (left) ECHAM6 at T63 (≈180 km) horizontal resolution and (right) FESOM; the FESOM grid resolution is indicated by color coding in km (Sidorenko et al. 2015, Fig. 1).]

[Schematic: AWI-CM coupling. The atmosphere (ECHAM6 with the JSBACH land surface) and the ocean (FESOM 1.4, which includes sea ice) run as separate programs coupled through OASIS3-MCT; fluxes are passed to the ocean and the ocean/ice state to the atmosphere. Each compartment has its own time stepper for the in-compartment step & coupling; the assimilation call is added with 1 line each in ECHAM and FESOM.]



Overview

PDAF provides parallelization support and fully-implemented and parallelized filters & smoothers. We add a few subroutine calls to the model codes to enable ensemble assimilation without model restarts. PDAF is free open-source: code, documentation and tutorials are available at http://pdaf.awi.de.


Assimilation Experiments

• Weakly coupled assimilation into the ocean
• State vector: ocean surface height, temperature, salinity, velocities (a toy sketch of how such a state vector is assembled follows this list)
• Ensemble size: up to 46
• Assimilation method: Local Error-Subspace Transform Kalman Filter (LESTKF)
• Simulation period: year 2016, daily assimilation update
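As an illustration of the state-vector entry in the list above, the following toy Fortran sketch concatenates ocean fields into a single vector as the filter sees it. The array names and sizes are hypothetical, and the real coupling between FESOM fields and PDAF goes through PDAF's user-supplied routines, which this sketch does not show.

    program build_ocean_state_vector
      implicit none
      ! Toy sizes standing in for the FESOM surface and 3-D grids (hypothetical).
      integer, parameter :: n_surf = 10, n_3d = 40
      real :: ssh(n_surf), temp(n_3d), salt(n_3d), u(n_3d), v(n_3d)
      real :: state(n_surf + 4*n_3d)
      integer :: off

      ! Dummy field values; in the real system these come from the model.
      ssh = 0.0; temp = 10.0; salt = 35.0; u = 0.0; v = 0.0

      ! Concatenate the fields into one vector, keeping track of the offsets
      ! so that analysis increments can later be written back to the right fields.
      off = 0
      state(off+1:off+n_surf) = ssh;   off = off + n_surf
      state(off+1:off+n_3d)   = temp;  off = off + n_3d
      state(off+1:off+n_3d)   = salt;  off = off + n_3d
      state(off+1:off+n_3d)   = u;     off = off + n_3d
      state(off+1:off+n_3d)   = v;     off = off + n_3d

      print *, 'state vector length:', size(state)
    end program build_ocean_state_vector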

Compute Performance

• Run time for N=46: 12 hours (fully parallelized on 12,144 processors)
• Scaling test: increase ensemble size and number of processors
  - Slightly different forecast duration for each ensemble member
  - Run time only increases by 17% for a 10-fold ensemble size

[Figure: integration time of all ensemble members, relative to the mean for N=2, plotted against the ensemble size N (5-20).]

Results: Assimilation of SST

[Figure: temperature difference (°C) between model and observations without assimilation (RMSE: 2.12 °C) and with assimilation (RMSE: 1.01 °C).]

Temperature difference between the model simulations and the observations after 10 days. Initially there are very large temperature deviations because the free-running coupled model does not know the observed state. The assimilation significantly reduces the deviations from the observations globally.

Results: Assimilation of T & S Profiles

[Figure: model-simulated and observed temperature (°C) and salinity (psu) at 100 m depth, observations shown as dots.]

Model-simulated and observed temperature and salinity at 100 m depth after 4 assimilation days (observations shown as dots). There are still significant deviations, e.g. for the temperature in the equatorial region.

Observations

Satellite SST
• EU Copernicus Marine Service
• Global daily data
• Data gaps due to clouds

Temperature & salinity profiles
• EN4 data from the UK Met Office
• Global daily data
• Subsurface down to 5000 m
• ~1000 profiles per day

[Figure: SST on 1/1/2016; profile locations on 1/1/2016; temperature and salinity (psu) on 1/1/2016 at 100 m depth.]


References:
[1] Nerger, L., Hiller, W. (2013). Software for Ensemble-based Data Assimilation Systems - Implementation Strategies and Scalability. Comp. & Geosci. 55: 110-118.
[2] Sidorenko, D. et al. (2015). Towards multi-resolution global climate modeling with ECHAM6–FESOM. Part I: Model formulation and mean climate. Clim. Dyn. 44: 757-780.
