Past Applications, Lessons Learned, Current Thinking


Page 1: Past Applications, Lessons Learned,   Current Thinking

Past Applications, Lessons Learned, Current Thinking
Levi Brekke (Reclamation, Research & Development Office)

NCPP Quantitative Evaluation of Downscaling Workshop, Boulder, CO

Panel discussion: “Using downscaled data in the real world: Sharing experiences, Part II”, 15 August 2013

Page 2: Past Applications, Lessons Learned,   Current Thinking

Traditional climate context in planning:

I. Choose Climate Context
– Instrumental Records: observed weather (T and P) and runoff (Q)

II. Relate to Planning Assumptions
– Supply Variability, Demand Variability, Operating Constraints

III. Conduct Planning Evaluations
– System Analysis, Evaluate Study Questions (related to Resource Management Objectives)

Page 3: Past Applications, Lessons Learned,   Current Thinking

I. Decision-Makers: “Keep it simple.”

II. Climate Information Providers: “Here’s the info… use it wisely.”

III. Technical Practitioners (Ushers): “Keep it Manageable.”

Page 4: Past Applications, Lessons Learned,   Current Thinking

Flow of Information: General View

1) Survey future climate information over the study region
2.a) Decide whether to cull information, and how…
2.b) Decide how to use retained information…
3) Assess climate change impacts on planning assumptions (e.g., supplies, demands, and/or water management constraints), via analyses of various responses
4) Assess operations and dependent resource responses; characterize uncertainties

Page 5: Past Applications, Lessons Learned,   Current Thinking

Box 2.a) Climate Model Contest … everyone wants it, utility unclear

Focusing on CA, Brekke et al. (2008) considered “historical” simulations from 17 GCMs, and found similar skill when enough metrics were considered. Focusing globally, Gleckler et al. 2008 and Reichler et al. 2008 found similar results.

Focusing on CA, projection distributions didn’t change much when the GCM-skill assessment (Brekke et al. 2008) was used to reduce the set of 17 GCMs to a “better” set of 9 GCMs.

Santer et al. PNAS 2009 – results from a global water vapor detection and attribution (D&A) study were largely insensitive to skill-based model weighting. Pierce et al. PNAS 2009 – results from western U.S. D&A study were more sensitive to ensemble size than skill-based model weighting.
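In code terms, the culling decision in Box 2.a amounts to ranking GCMs by a composite of skill metrics and keeping the better-scoring subset. A minimal sketch of that idea in Python, with purely illustrative model names, metric values, and cutoff (none of this reproduces the actual Brekke et al. 2008 scoring):

```python
import numpy as np

# Hypothetical relative-error scores (lower is better) for a few GCMs
# across several skill metrics; names and numbers are illustrative only.
scores = {
    "gcm_a": [0.20, 0.35, 0.10],
    "gcm_b": [0.25, 0.30, 0.15],
    "gcm_c": [0.60, 0.55, 0.50],
    "gcm_d": [0.40, 0.45, 0.35],
}

def composite_skill(metric_errors):
    """Collapse per-metric errors into one composite score (simple mean)."""
    return float(np.mean(metric_errors))

# Rank models by composite score and retain the better-scoring half.
ranked = sorted(scores, key=lambda name: composite_skill(scores[name]))
retained = ranked[: len(ranked) // 2]
print(retained)  # ['gcm_a', 'gcm_b']
```

As the slide notes, in the cited studies this kind of skill-based culling or weighting changed the projection distributions and D&A results surprisingly little.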

Page 6: Past Applications, Lessons Learned,   Current Thinking

Box 2.b) Two Method Classes (generally speaking)

• Period-Change
– prevalent among impacts studies
– “perturbed historical” at some milestone future

• Transient
– time-evolving view, from past to future
– prevalent in the climate science community (“climate projections”)

Page 7: Past Applications, Lessons Learned,   Current Thinking


PNW Example: Three Fed agencies (BPA, USACE, Reclamation) adopting consensus scenarios

1) Used UW CIG HB2860 scenarios (Period-Change + Transient)
2) Selected smaller set of both scenario types

Period-Change type was of most interest. Goal was to select a set that spans the rest:
LW = less warming, MW = more warming; D = drier, W = wetter; C = central; MC = minimal change

Page 8: Past Applications, Lessons Learned,   Current Thinking


Logical process, but there were surprises

Scenarios were selected for big-basin change… sub-basin changes didn’t always reflect the big-basin scenarios (e.g., the Upper Snake is wetter in 5 of 6 scenarios)

Page 9: Past Applications, Lessons Learned,   Current Thinking

Projection-specific approach: individual projections inform change definitions (e.g., black dot choices)

Ensemble-informed approach: projections are grouped and their pooled information informs definitions

Reclamation 2010, Oklahoma Reservoirs Yield Study
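To make the distinction concrete: a projection-specific method keeps one set of monthly change factors per projection, while an ensemble-informed method pools the projections before defining the scenario. A minimal sketch with synthetic monthly means (array shapes and values are illustrative, not taken from the Oklahoma study):

```python
import numpy as np

# proj_hist / proj_fut: monthly-mean precipitation for each projection,
# shape (n_projections, 12); values here are synthetic placeholders.
rng = np.random.default_rng(0)
proj_hist = rng.uniform(40, 120, size=(17, 12))
proj_fut = proj_hist * rng.uniform(0.8, 1.2, size=(17, 12))

# Projection-specific: one set of 12 monthly change factors per projection.
change_per_projection = proj_fut / proj_hist            # shape (17, 12)

# Ensemble-informed: pool the projections (here, the ensemble median by
# month) into a single "consensus" set of monthly change factors.
change_ensemble = np.median(change_per_projection, axis=0)  # shape (12,)

print(change_per_projection.shape, change_ensemble.shape)
```

The per-projection factors can jump month to month within a single projection (the “questionable” serial monthly impacts noted on the next slide), while the pooled factors smooth toward the ensemble consensus.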

Page 10: Past Applications, Lessons Learned,   Current Thinking

… another concern comes from the portrayal of monthly impacts

Two projection-specific methods; one ensemble-informed method (Reclamation 2010, Oklahoma Reservoirs Yield Study)

… projection-specific approaches can lead to serial monthly impacts that seem questionable

… ensemble-informed approaches emphasize “consensus” changes from projections

Page 11: Past Applications, Lessons Learned,   Current Thinking

Scoping with thought towards decision-making?

Page 12: Past Applications, Lessons Learned,   Current Thinking

Most info. generation approaches have been science-centric.

1. What do we think we know about future climate? (science synthesis, survey of past and projected climate information)

2. Which climate information do we feel comfortable about relating to our decisions?

3. Select an information frame that relates reliable future climate aspects to decisions.

[Diagram: questions arranged along an axis from Science Application to Decision-Making]

Page 13: Past Applications, Lessons Learned,   Current Thinking

Would we select the same approach using a decision-centric view?

1. What are the various decisions that we are considering?

2. What hydroclimate conditions are relevant to these decisions? (Scales!)

3. Select an information frame that relates relevant future climate aspects to decisions.

[Diagram: questions arranged along an axis from Decision-Making to Science Application]

Page 14: Past Applications, Lessons Learned,   Current Thinking

Decision-Support Information: we can approach from both views…

Science-centric (What’s credible?)
1. What do we think we know about future climate? (science synthesis, survey of past and projected climate information)
2. Which climate information do we feel comfortable about relating to our decisions?
3. Select an information frame that relates reliable future climate aspects to decisions.

Decision-centric (What’s relevant?)
1. What decisions are we considering?
2. What’s the relevance of future hydroclimate for these decisions? (Scales!)
3. Select an information frame that relates relevant future climate aspects to decisions.

Page 15: Past Applications, Lessons Learned,   Current Thinking

Another way of looking at this…

What’s Applicable?
– What’s Reliable? Global Climate Models’ Simulation Qualities
– What’s Relevant? System Sensitivities to Climate Changes
– Practical limitations? Tool Fitness, Project Resources

Page 16: Past Applications, Lessons Learned,   Current Thinking

Scoping Questions…

What’s relevant?
– What are the study decisions and level of interest in climate uncertainties?
– What system metrics influence the study decisions?
– Which types of climate changes influence these metrics the most?

What’s reliable?
– What types of regional future climate and hydrologic datasets are available?
– Which climate and/or hydrologic changes are projected well?
– What future climate/hydrologic assumptions should still be based on history?

Practical limitations?
– What modeling steps are required to assess system metrics?
– How does climate change influence each modeling step?
– Which climate change influences are practical to represent?

Page 17: Past Applications, Lessons Learned,   Current Thinking

FY13-14 Project: Evaluating the Relevance, Reliability, and Applicability of CMIP5 Climate Projections for Water Resources and Environmental Planning

• Goal
– develop & demonstrate a framework for evaluating information relevance & reliability to guide judgment of applicability

• Approach
– Broadband quality evaluation of CMIP5 (what’s reliable?); serve results on web
– System sensitivity analyses (what’s relevant?)
– Applicability Pilots (observe process, characterize framework for use elsewhere)

• Collaborators (& POCs)
– Reclamation (Ian Ferguson, [email protected])
– USACE (Jeffrey Arnold)
– NOAA ESRL (Mike Alexander)
– NOAA CIRES (Jamie Scott)

Page 18: Past Applications, Lessons Learned,   Current Thinking

Extras

Page 19: Past Applications, Lessons Learned,   Current Thinking

Two Method Classes have emerged…

• Period-Change
– prevalent among impacts studies
– “perturbed historical” at some milestone future

• Transient
– time-evolving view, from past to future
– prevalent in the climate science community (“climate projections”)

Page 20: Past Applications, Lessons Learned,   Current Thinking

Period-Change: Overview

• Historical climate variability sequence is retained (space and time)

• “Climate Change” scenarios are defined for perturbing the historical record, where a change is diagnosed from a historical period to a future period (a sketch of the perturbation step follows below)

• Studies typically feature a Historical scenario and multiple climate change scenarios in order to reveal impacts uncertainty

• Several methods are available to define scenarios, differing by:
– time (e.g., change in means, change in distributions)
– space (e.g., change in regional condition, or change in spatially disaggregated conditions), and
– amount of information (e.g., single climate projection, or many projections)
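The “perturbed historical” step referenced above is, at its simplest, a rescaling or shifting of the observed sequence by the diagnosed period change, so the familiar variability is retained. A minimal sketch, assuming multiplicative monthly factors for precipitation and additive deltas for temperature (the function and variable names are illustrative, not a specific Reclamation tool):

```python
import numpy as np

def perturb_historical(obs, monthly_change, months, additive=False):
    """Apply a 12-value period change to an observed monthly series.

    obs            : observed series (one value per time step)
    monthly_change : length-12 change (ratio, or delta if additive=True)
    months         : month index 1-12 for each time step of obs
    """
    obs = np.asarray(obs, dtype=float)
    change = np.asarray(monthly_change)[np.asarray(months) - 1]
    return obs + change if additive else obs * change

# Example: perturb two years of monthly precipitation by +10% in every month.
months = np.tile(np.arange(1, 13), 2)
precip_obs = np.full(24, 80.0)
precip_future = perturb_historical(precip_obs, np.full(12, 1.10), months)
print(precip_future[:3])  # [88. 88. 88.]
```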

Page 21: Past Applications, Lessons Learned,   Current Thinking

Period-Change: Pros and Cons

• Pros:
– Retains familiar historical variability patterns
– Simple frame for exploring system sensitivity
– Permits “cautious” sampling of temporal aspects from climate projections (e.g., can be simple like change in annual mean, or complex like change in monthly distribution)

• Cons:
– Less ideal for adaptation planning; climate change timing matters
– Diagnosing period “Climate Change” is not obvious (more of a problem for DP than for DT)
– (when single projections inform climate change scenarios) month-to-month changes may seem disorderly or noisy

Page 22: Past Applications, Lessons Learned,   Current Thinking

Transient: Overview

• Historical climate variability sequence is not retained (but the distribution may be retained through climate projection bias-correction…)

• “Climate” projections are selected to define an evolving envelope of climate possibility, representing simulated past to projected future
– Monthly or daily time series projections typically used

• Climate projections may be developed using various methods, e.g.:
– Time series outputs from a GCM simulation (or a GCM-RCM simulation)
– … bias-corrected and spatially downscaled translations of these outputs (a bias-correction sketch follows below)
– … stochastically resampled (resequenced) versions of these outputs, reflecting a different frequency reference (observations, paleoproxies)

• Studies need to feature a large ensemble of climate projections to adequately portray an envelope of climate possibility through time
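The bias-correction mentioned in the overview is commonly some form of quantile mapping: each simulated value is replaced by the observed value at the same quantile of the overlapping historical period. The sketch below shows only that generic idea; it is not the specific BCSD procedure behind the archived downscaled projections, and all inputs are synthetic:

```python
import numpy as np

def quantile_map(sim, sim_hist, obs_hist):
    """Empirical quantile mapping: replace each simulated value with the
    observed value at the same quantile of the historical distributions."""
    sim = np.asarray(sim, dtype=float)
    # Quantile of each simulated value within the model's own historical record
    q = np.searchsorted(np.sort(sim_hist), sim) / len(sim_hist)
    q = np.clip(q, 0.0, 1.0)
    # Map those quantiles onto the observed historical distribution
    return np.quantile(obs_hist, q)

rng = np.random.default_rng(1)
obs_hist = rng.gamma(2.0, 30.0, 1000)   # observed precipitation, illustrative
sim_hist = rng.gamma(2.0, 40.0, 1000)   # wet-biased model, illustrative
sim_fut = rng.gamma(2.0, 45.0, 50)      # future simulation, illustrative
print(quantile_map(sim_fut, sim_hist, obs_hist).mean())
```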

Page 23: Past Applications, Lessons Learned,   Current Thinking

Transient: Pros and Cons

• Pros:
– Avoids challenges of “Climate Change” diagnosis
• Not discussed, but a key issue is “multi-decadal variability” in projections
– Supports “master planning” for CC adaptation
• schedule of adaptations through time, including project triggers

• Cons:
– Projection historical sequences differ from experience
– Requires “aggressive” sampling of temporal information from climate projections (frequencies vary by member, and may be questionable)
– Information is more complex
• Requires use of many projections, challenging analytical capacities, and requiring probabilistic discussion of results evolving through time (see the percentile-envelope sketch below)… requires a learning phase
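One common way to have that probabilistic, time-evolving discussion is to summarize the ensemble at each time step as percentile bands. A minimal sketch with a synthetic ensemble (the trend, noise level, and variable are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(2000, 2100)
# One simulated annual result (e.g., an inflow index) per projection and
# year; synthetic declining trend plus noise, for illustration only.
ensemble = 100.0 - 0.1 * (years - 2000) + rng.normal(0, 5, (50, years.size))

# Time-evolving envelope: 10th, 50th, 90th percentiles across the ensemble.
p10, p50, p90 = np.percentile(ensemble, [10, 50, 90], axis=0)
print(years[-1], round(p10[-1], 1), round(p50[-1], 1), round(p90[-1], 1))
```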

Page 24: Past Applications, Lessons Learned,   Current Thinking

Legacy climate context for planning assumptions in water resources studies:

I. Choose Climate Context
– Instrumental Records: observed weather (T and P) and runoff (Q)

II. Relate to Planning Assumptions
– Supply Variability, Demand Variability, Operating Constraints

III. Conduct Planning Evaluations
– System Analysis, Evaluate Study Questions (related to Resource Management Objectives)

Page 25: Past Applications, Lessons Learned,   Current Thinking

We’ve developed ways to blend climate change information into this context (e.g., Reclamation 2008, Mid-Pacific Region’s Central Valley Project – Operations Criteria and Plan, Biological Assessment).

I. Choose Climate Context
– Instrumental Records: observed weather (T and P) and runoff (Q)
– Global Climate Projections (representing various GCMs, forcings) → bias-correction, spatial downscaling → Regional T and P → watershed simulation → Runoff
– Global T and P → Sea Level Rise → Delta Flow-Salinity Relationship → Constraint on Upstream Operations
– Regional T → Stream Water Temperature analyses → Reservoir Operations

II. Relate to Planning Assumptions
– Supply Variability, Demand Variability, Operating Constraints

III. Conduct Planning Evaluations
– Future Operations Portrayal for OCAP BA (flows, storage, deliveries, etc.)

Page 26: Past Applications, Lessons Learned,   Current Thinking

When using projected climate, future climate & hydrology assumptions typically reflect a blend of observed and projected information.

I. Choose Climate Context
– Instrumental Records: observed weather (T and P) and runoff (Q)
– Global Climate Projections (representing various GCMs, emissions) → bias-correction, spatial downscaling → Regional T and P → watershed simulation → Runoff (Runoff Magnitudes)

II. Relate to Planning Assumptions
– Supply Variability, Demand Variability, Operating Constraints

III. Conduct Planning Evaluations
– System Analysis, Evaluate Study Questions (related to Resource Management Objectives)

Reclamation 2010. Info: Levi Brekke ([email protected]), Tom Pruitt ([email protected])

Page 27: Past Applications, Lessons Learned,   Current Thinking

… future climate & hydrology assumptions can also be based on a blend of observed and paleoclimate information.

I. Choose Climate Context
– Instrumental Records: observed weather (T and P) and runoff (Q)
– Paleoclimate Proxies: reconstructed runoff (Q) → statistical modeling → Runoff (Runoff Magnitudes, Yeartype Spells)

II. Relate to Planning Assumptions
– Supply Variability, Demand Variability, Operating Constraints

III. Conduct Planning Evaluations
– System Analysis, Evaluate Study Questions (related to Resource Management Objectives)

http://www.usbr.gov/lc/region/programs/strategies.html

Page 28: Past Applications, Lessons Learned,   Current Thinking

… we can also blend all three (Reclamation 2009, CRWAS 2011, others).

I. Choose Climate Context
– Instrumental Records: observed weather (T and P) and runoff (Q)
– Paleoclimate Proxies: reconstructed runoff (Q) → statistical modeling → Yeartype Spells
– Global Climate Projections (representing various GCMs, emissions) → bias-correction, spatial downscaling → Regional T and P
– Instrumental Records: observed weather (T and P) and runoff (Q) → watershed simulation → Runoff (Runoff Magnitudes)

II. Relate to Planning Assumptions
– Supply Variability, Demand Variability, Operating Constraints

III. Conduct Planning Evaluations
– System Analysis, Evaluate Study Questions (related to Resource Management Objectives)

http://www.usbr.gov/research/docs/2009_Hydrology-DiffClimateBlends.pdf

Page 29: Past Applications, Lessons Learned,   Current Thinking

Flow of Information: UW CIG HB 2860 Data…

1) Survey future climate information over the study region (considered 100+ current projections…)
2.a) Decide whether to cull information, and how… (decided to focus on 19 projections…)
2.b) Decide how to use retained information… (made two types of Columbia Basin weather and hydrology)
3) Assess climate change impacts on planning assumptions related to water supply and power demands (Hydrologic Simulation, Electricity Demand Modeling)
4) Assess operations response (Reclamation, USACE, and BPA systems & models)

http://warm.atmos.washington.edu/2860/

Page 30: Past Applications, Lessons Learned,   Current Thinking

Flow of Information: … used by Fed PNW agencies

1) Survey future climate information over the study region (considered CIG’s 19 projections…)
2.a) Decide whether to cull information, and how… (decided to focus on smaller set…)
2.b) Decide how to use retained information… (decided to use both types of Columbia Basin weather and hydrology…)
3) Assess climate change impacts on planning assumptions related to water supply and power demands (Hydrologic Response and Local Power Demand Response)
4) Assess operations and dependent resource responses; characterize uncertainties (assessing operations under both types of information… insights for planning applications)

http://www.usbr.gov/pn/programs/climatechange/reports/index.html