Page 1:

Evaluating RCM Experiments
PRECIS Workshop
Tanzania Meteorological Agency, 29th June – 3rd July 2015

Page 2:

Objectives of the session

1. Understand the context for RCM evaluation
2. Identify the main components of a model evaluation
3. Discuss different evaluation techniques and aspects to consider
4. Provide some examples

Page 3:

What should we expect from a model?


“All models are wrong, but some are useful”

George Box, 1987

Page 4:

What should we expect from a model?

Models are not “truth machines”


Page 5:

What should we expect from a model?

“For numerical weather prediction, for example, skill is relatively well defined because forecasts can be verified on a daily basis.

For a climate model, it is more difficult to define a unique overall figure of merit, metric or skill score for long-term projections. Each model tends to simulate some aspects of the climate system well and others not so well, and each model has its own set of strengths and weaknesses.

We do not need a perfect model, just one that serves the purpose. An aeroplane, for example, can be constructed with the help of numerical models that are not able to properly simulate turbulent flow.”

R. Knutti (2008), Should we believe model predictions of future climate change?

Page 6:

What is a model evaluation and why is it important?

What:
– An assessment of how well the model is able to simulate the “present day”, observed climate
– A model evaluation is not a model verification

Why:
– It enables you to gain familiarity with the model characteristics
– It indicates which aspects of the model simulation are most credible
– …and therefore indicates how to make the best, most credible, use of the data to answer relevant questions


Page 7:

Undertaking a model evaluation


Page 8:

The model evaluation process

• Identify the target and purpose of the evaluation

• Obtain multiple sources of observed data to evaluate model performance

• Assess the errors and biases in the GCMs that provide the lateral boundary conditions (LBCs) for the RCM

• Evaluate the RCM acknowledging the multiple sources of uncertainty


Page 9:

Identify the target and purpose of the evaluation

What aspects of the climate system are of most interest?

• What climate processes are key to understanding climate variability/change in the focus region?

• What variables (e.g. temperature, precipitation, humidity) are of most interest?

What time and space scales are of interest?

• Are you interested in extreme or rare events, or multi-year averages?

• Does the model need to provide accurate data at a specific spatial scale?


Page 10:

Choice of observed data

Use as many relevant observed datasets as possible

Gridded datasets
– Observed datasets – e.g. CRU (land surface), TRMM (satellite rainfall), GPCP (merged rain gauge and satellite rainfall)
– Reanalysis data – e.g. ERA-Interim (atmosphere)

Station data
– Use with caution! It can be useful to compare station data directly to model output, but be aware of differences in spatial scales; ultimately one would not expect the data to match.

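To make “use as many relevant observed datasets as possible” concrete, here is a minimal Python/xarray sketch (the file names, variable names and averaging period are hypothetical; real CRU/GPCP products have their own conventions) that puts a model climatology and two observed climatologies on a common grid before differencing them. Comparing against more than one dataset gives a feel for the observational envelope discussed on the next slide.

```python
import xarray as xr

# Hypothetical monthly-mean precipitation files (mm/day)
model = xr.open_dataset("rcm_pr_monthly.nc")["pr"]
cru   = xr.open_dataset("cru_pr_monthly.nc")["pre"]
gpcp  = xr.open_dataset("gpcp_pr_monthly.nc")["precip"]

# Long-term mean climatologies over a common period
period = slice("1981", "2010")
clim = {name: da.sel(time=period).mean("time")
        for name, da in [("model", model), ("CRU", cru), ("GPCP", gpcp)]}

# Interpolate everything onto the coarsest grid (assumed here to be GPCP's)
coarse = clim["GPCP"]
for name in ("model", "CRU"):
    clim[name] = clim[name].interp_like(coarse)

# The apparent model bias depends on which observational dataset is used
for obsname in ("CRU", "GPCP"):
    bias = clim["model"] - clim[obsname]
    print(f"model minus {obsname}: mean bias = {float(bias.mean()):+.2f} mm/day")
```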

Page 11:

Choice of observed data

Different datasets provide different answers. There is no single truth but an envelope of possible or probable pasts.

Hewitson et al (2013) Interrogating empirical-statistical downscaling. Climatic Change, 122(4), 539-554

Page 12:

Choice of observed data

DJF (winter) mean, maximum and minimum temperatures at each grid cell over the period 1963 to 2010 for West Africa; data taken from the CRU TS3.22 and UDEL datasets (green lines show the semi-arid region).

Page 13:

Assess GCM data providing LBCs

Knutti et al (2013) GRL

Page 14:

Evaluating RCM Output


Page 15:

Evaluating how well the RCM represents the current climate

Model system = GCM + RCM

Q1. Are there discrepancies in the model output?
– Between parts of the RCM and GCM model output
– Between parts of the model output and ‘reality’

Q2. If so, why?
– Systematic model bias (error in the model’s physical formulation)
– Spatial sampling issues (differences in resolution of model and observations)
– Observational error (gridding issues, instrument-dependent errors)

Page 16:

Evaluating how well the RCM represents the current climate

RCM errors have three sources:
– Physical errors in the GCM affecting the LBCs
– Physical errors in the RCM
– RCM/GCM consistency errors

(Diagram: RCM and GCM linked by “consistency”; each linked to Observations by “realism”)

Page 17:

Evaluating how well the RCM represents the current climate

There is potential for four separate validations:
– GCM vs Observations
– RCM driven by GCM vs GCM
– RCM driven by GCM vs Observations
– RCM driven by Observations vs Observations

(Diagram as on the previous page: RCM and GCM linked by “consistency”; each linked to Observations by “realism”)
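As a rough illustration of how these four comparisons could be organised in an analysis script (a sketch only: the netCDF file names are hypothetical, all fields are assumed to be climatologies already regridded to the observation grid, and the “RCM driven by Observations” run is realised, as usual, with reanalysis LBCs):

```python
import xarray as xr

# Hypothetical precipitation climatologies on a common grid (mm/day)
gcm     = xr.open_dataset("gcm_pr_clim.nc")["pr"]            # free-running GCM
rcm_gcm = xr.open_dataset("rcm_gcmdriven_pr_clim.nc")["pr"]  # RCM with GCM LBCs
rcm_rea = xr.open_dataset("rcm_readriven_pr_clim.nc")["pr"]  # RCM with reanalysis LBCs
obs     = xr.open_dataset("obs_pr_clim.nc")["pr"]            # gridded observations

# The four validations from the slide, each reduced here to a mean difference
comparisons = {
    "GCM vs Observations":              gcm - obs,      # realism of the driving GCM
    "RCM (GCM-driven) vs GCM":          rcm_gcm - gcm,  # RCM/GCM consistency
    "RCM (GCM-driven) vs Observations": rcm_gcm - obs,  # realism of the full system
    "RCM (obs-driven) vs Observations": rcm_rea - obs,  # RCM errors with quasi-observed LBCs
}
for name, diff in comparisons.items():
    print(f"{name}: domain-mean difference = {float(diff.mean()):+.2f} mm/day")
```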


Page 22:

Aspects to consider in evaluation

Assess as many meteorological variables as possible

At least: Surface air temperature, precipitation, upper air winds

Examine the physical realism exhibited within the model

E.g. In cool and wet conditions we may expect high soil moisture. Is this so?

Use both spatial and temporal information

Spatial:
– Full fields
– Smaller areas
– Vertical profiles
– Area averages

Temporal:
– Time series
– Seasonal, annual and decadal means
– Higher-order statistics (variability, extremes)
– Different seasons, different regimes
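As an illustration of combining these spatial and temporal views (a minimal xarray sketch; the file name, variable name and sub-domain are hypothetical):

```python
import numpy as np
import xarray as xr

# Hypothetical daily 2 m temperature from an RCM experiment
tas = xr.open_dataset("rcm_tas_daily.nc")["tas"]

# Temporal: seasonal (DJF, MAM, JJA, SON) and annual means
seasonal = tas.resample(time="QS-DEC").mean()  # means labelled by season start
annual   = tas.resample(time="YS").mean()

# Spatial: an area average over a sub-domain, weighted by grid-cell area
# (cos(latitude) weights suit a regular lat-lon grid with ascending latitudes)
box = tas.sel(lat=slice(-12, 0), lon=slice(28, 41))  # e.g. a Tanzania-sized box
weights = np.cos(np.deg2rad(box.lat))
area_mean = box.weighted(weights).mean(("lat", "lon"))

# Higher-order statistics: interannual variability of the annual area mean
print(float(area_mean.resample(time="YS").mean().std()))
```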

Page 23:

Compare like with like

Data only have skill at spatial scales resolved by their grids

Make sure to aggregate or interpolate datasets to the coarsest grid before comparing data

In general:

Average(Index) ≠ Index(Average)

i.e. averaging grid-cell indices is not the same as computing the index from spatially averaged data.

When comparing datasets at different resolutions, you must be careful to compare like with like.

Chen & Knutson (2008) On the Verification and Comparison of Extreme Rainfall Indices from Climate Models, J. Climate

(Figure panels: “Average data first” vs “Index first”)
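A toy demonstration of this ordering effect, with synthetic data (purely illustrative numbers): take the 95th percentile of daily rainfall as the index, and compute it before and after spatial averaging.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic daily rainfall for 4 neighbouring grid cells over 1000 days
pr = rng.gamma(shape=0.5, scale=8.0, size=(1000, 4))  # mm/day, heavy-tailed

# Index(Average): average the 4 cells to one coarse cell, then take the index
idx_of_avg = np.percentile(pr.mean(axis=1), 95)

# Average(Index): take the index in each cell, then average the 4 indices
avg_of_idx = np.percentile(pr, 95, axis=0).mean()

print(f"Index(Average) = {idx_of_avg:.1f} mm/day")
print(f"Average(Index) = {avg_of_idx:.1f} mm/day")
# Spatial averaging damps extremes, so Index(Average) comes out smaller here.
```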

Page 24:

Aspects to consider in evaluation

Model forecasts (or hindcasts) are not constrained by the observations (i.e. the weather) that actually happened. They are, however, constrained by forcings (i.e. CO2, lateral boundary data, surface boundary data).

Therefore, one cannot, in general, compare individual model years with their corresponding observed years. Rather, we are looking for agreement in the aggregated distribution of weather states (i.e. climate) over time.

However, when models are run using observed boundary data from reanalyses, model-year to actual-year comparisons can be worthwhile – reanalysis data are “quasi-observed” data.
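One way to compare aggregated distributions rather than individual years (a sketch using synthetic stand-in data; a real evaluation would use the observed and simulated seasonal-mean series):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Stand-ins for 30 years of observed and simulated seasonal-mean rainfall
obs_years   = rng.normal(loc=5.0, scale=1.2, size=30)  # mm/day
model_years = rng.normal(loc=5.4, scale=1.0, size=30)

# Year-by-year correlation is not meaningful for a GCM-driven run,
# but the two distributions can be compared, e.g. via quantiles...
for q in (10, 50, 90):
    print(f"q{q}: obs = {np.percentile(obs_years, q):.2f}, "
          f"model = {np.percentile(model_years, q):.2f}")

# ...or with a two-sample Kolmogorov-Smirnov test
ks = stats.ks_2samp(obs_years, model_years)
print(f"KS statistic = {ks.statistic:.2f}, p-value = {ks.pvalue:.2f}")
```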

Page 25:

Limits of evaluating models against observations


Evaluation of climate models based on past climate observations has some important limitations.

• We can only evaluate those variables and phenomena for which observations exist.

• In some places, there is a lack of, or insufficient quality of, long-term observations.

• The presence of long-term natural climate variability, which limits how closely models and observations can be expected to agree.

These limitations can be reduced, but not entirely eliminated, through the use of multiple independent observations of the same variable as well as the use of model ensembles.

Page 26:

The Regional Climate Model Evaluation System (RCMES) was designed to address the evaluation needs of programmes such as CORDEX and NARCCAP.

It was developed by the NASA Jet Propulsion Laboratory (JPL, managed by Caltech) and the University of California, Los Angeles (UCLA).

RCMES is composed of two main components:

1. The Regional Climate Model Evaluation Database (RCMED)
2. The Regional Climate Model Evaluation Toolkit (RCMET)

Details of RCMES are available at https://rcmes.jpl.nasa.gov/


Page 27:

Examples


Page 28:

Example 1: CORDEX RCMs, Africa

Biases in the simulated annual-mean precipitation (mm/day) against the CRU data.

From Kim et al (2013), Evaluation of the CORDEX-Africa multi-RCM hindcast: systematic model errors.

Page 29:

Example 2a: Seasonal mean precipitation, from PRECIS


Page 30:


Example 2b: Frequency of wet days, from PRECIS
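A wet-day frequency map like this can be computed directly from daily precipitation (a minimal sketch; the file and variable names are hypothetical, and a 1 mm/day threshold is a common definition of a wet day):

```python
import xarray as xr

# Hypothetical daily precipitation on a common grid (mm/day)
model = xr.open_dataset("rcm_pr_daily.nc")["pr"]
obs   = xr.open_dataset("obs_pr_daily.nc")["pr"]

WET = 1.0  # mm/day threshold defining a "wet day"

# Fraction of wet days at each grid cell over the full period
model_freq = (model >= WET).mean("time")
obs_freq   = (obs >= WET).mean("time")

bias = model_freq - obs_freq  # positive where the model rains too often
print(f"Domain-mean wet-day frequency bias: {float(bias.mean()):+.3f}")
```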

Page 31:

Example 3: Extreme rainfall event in a river catchment (using PRECIS)

Area-average precipitation in the Jhelum river basin (Pakistan) for September 1992, showing RCM simulations at 50 km and 25 km resolution alongside observations.


Page 32:

Example 4: Individual station vs. area averages

26 stations in a 25 km × 25 km area (black bars) and their area average (red bars).

The area average (cf. model grid-box output) differs considerably, and inconsistently, from most of the individual stations.


Page 33:

What now? Use of the RCM output beyond the evaluation

Having evaluated the RCM output, is it appropriate to use the simulated future climate output directly?

For what scales, variables and types of questions is the model output able to provide “useful” information?


Page 34:

To summarise

• There are many uncertainties which need to be taken into account when assessing climate change (and its impact) over a region

• Some account can currently be taken of most (BUT NOT ALL) uncertainties

• Even those uncertainties that can be accounted for are currently not well described

• There is a lot more work for us all to do!

Summary

Model evaluation is ESSENTIAL:
– It enables familiarisation with the model and its projected output
– A simulation may be over an area where the model performance is untested
– An evaluation provides a baseline for assessing the credibility of future projections from RCMs, which has implications for how the output can and should be used

Page 35:

Thanks for listening.

Questions?

Page 36:

Assess GCM data providing LBCs

The ability of RCMs to simulate the regional climate depends strongly on the realism of the large-scale circulation provided by the LBCs from the driving GCMs.

IPCC Fig 9.4, WG1, Chapter 9

Page 37:

(Diagram: RCMES workflow)

– Raw data of various sources, formats, resolutions and coverage (TRMM, MODIS, AIRS, CERES, soil moisture, etc.) are ingested through extractors for the various data formats into RCMED (Regional Climate Model Evaluation Database): a large, scalable database that stores data from a variety of sources, plus metadata, in a common format on their native grids.

– RCMET (Regional Climate Model Evaluation Toolkit) is a library of codes for extracting observations from RCMED and model data (binary or netCDF, including from other data centres such as ESG, DAAC and the ExArch network), regridding the observed and model data onto the same time/space grid, calculating evaluation metrics and visualising them. The regridded data can also be used for the user’s own analyses and visualisation.

Page 38:

Example 4: South Asian monsoon break-active phase precipitation

(Figure: precipitation in monsoon “break” and “active” phases, comparing GCM and RCM)

Page 39:

Example 4: Observed precipitation over the Alps

Average rainfall for the period 1971–1990 from (left) the CRU 3.0 dataset (resolution 0.5° × 0.5°) and (right) the Frei and Schaer Alpine analysis (resolution 0.3° × 0.22°).