Uncertainty Analysis: An Evaluation Metric for Synthesis Science

Mark E. Harmon, Richardson Chair and Professor

Department of Forest Ecosystems and Society, Oregon State University

ESA 2013 Organized Session

Two Complementary Sides to Science

• Reductionist

– Reduce down

– Simplify

– Control confounding factors

– Additive to the degree possible

• Synthesis

– Build up

– Address complexity

– Retain confounding factors

– Interactive; whole more than sum of parts (?)

Sources of Uncertainty-1

• Measurement error (experimental error)

• Natural variation in space and time

• Model parameter error

• Model selection error

Sources of Uncertainty-2

• Measurement error (experimental error)

– Accuracy: how close to the truth?

– Precision: how repeatable?

– Detection limits: how small?

• Primarily considered in:

– Laboratory analyses

– Climate, hydrologic, ecophysiology instrumentation

Sources of Uncertainty-3

• Natural variation in space and time

– Improve estimates of mean and variation via sample design

– Cannot be completely eliminated

• Primarily considered in:

– Field sampling

– Field experiments

– Statistical tests

Sources of Uncertainty-4

• Model parameter error

– Simple to complex conversions of one variable to another require a model

– Uncertainty of parameter value

– Can be reduced but not eliminated completely

• Primarily considered in:

– Ecosystem estimates

– Contrast these conversions (see the sketch below):

– BA = π × DBH²/4 vs. Biomass = B₁ × DBH^B₂
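
To make the parameter contrast concrete, here is a minimal Python sketch of propagating allometric parameter uncertainty by Monte Carlo. The equation form follows the slide, but the parameter values, their assumed spreads, and the normal error model are illustrative placeholders, not the equations used in this analysis.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative allometric parameters (placeholders, not this study's values):
# biomass (kg) = B1 * DBH**B2, with DBH in cm.
B1_MEAN, B1_SD = 0.06, 0.003   # SD ≈ 5% of the mean (assumed)
B2_MEAN, B2_SD = 2.40, 0.02    # SD ≈ 1% of the mean (assumed)

def biomass_draws(dbh_cm, n_draws=10_000):
    """Monte Carlo draws of one tree's biomass reflecting parameter uncertainty only."""
    b1 = rng.normal(B1_MEAN, B1_SD, n_draws)
    b2 = rng.normal(B2_MEAN, B2_SD, n_draws)
    return b1 * dbh_cm ** b2

draws = biomass_draws(50.0)   # a 50 cm DBH tree
print(f"biomass = {draws.mean():.0f} ± {draws.std():.0f} kg "
      f"({100 * draws.std() / draws.mean():.1f}% relative error)")

# Basal area, by contrast, has no fitted parameters:
# BA (m^2) = pi * (DBH / 200)**2, so only measurement error applies to it.
```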

Sources of Uncertainty-5

• Model selection error

– Knowledge uncertainty of how to proceed

– Introduces a systematic, not a random, error

– Can only be reduced with more knowledge

• Primarily considered in:

– Ecosystem estimates

– Simulation models

– Synthetic efforts

– Example: Are tree stems cones, neiloids, or paraboloids? (see the sketch below)
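
As a toy version of the stem-form question, the sketch below computes stem volume for the same tree under three assumed shapes using textbook form factors (cone 1/3, paraboloid 1/2, neiloid 1/4). The tree dimensions and the use of breast-height basal area are assumptions made only for this illustration.

```python
import math

FORM_FACTORS = {"cone": 1 / 3, "paraboloid": 1 / 2, "neiloid": 1 / 4}

def stem_volume(dbh_cm, height_m, form):
    """Stem volume (m^3) = basal area * height * form factor for the assumed shape."""
    basal_area = math.pi * (dbh_cm / 200.0) ** 2   # m^2, with DBH in cm
    return basal_area * height_m * FORM_FACTORS[form]

for form in FORM_FACTORS:
    print(f"{form:10s}: {stem_volume(50.0, 40.0, form):.2f} m^3")

# The spread among these estimates is systematic: re-measuring the same tree
# does not shrink it; only better knowledge of stem form does.
```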

Watershed 1, H. J. Andrews Experimental Forest

[Photos: before; before burning; 20 yrs after burning; 30 yrs after burning]

Measurement Error

[Figure: aboveground biomass (Mg/ha) vs. year of measurement, 1975-2010; Biopak mean with ±2 standard error bounds. Measurement error ±2% per tree, N = 3,000 trees; mean relative error ≈ 0.09%.]
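
A minimal sketch of why a ±2% error on each tree nearly vanishes at the stand level: if the errors are independent, they largely average out in the sum over roughly 3,000 trees. The per-tree biomass distribution, the reading of ±2% as a standard deviation, and the independence assumption are placeholders for illustration, not the talk's error model.

```python
import numpy as np

rng = np.random.default_rng(0)

N_TREES = 3_000
true_kg = rng.lognormal(mean=4.0, sigma=1.0, size=N_TREES)  # hypothetical per-tree biomass

def stand_totals(per_tree_cv=0.02, n_draws=2_000):
    """Monte Carlo: apply independent per-tree measurement error, sum to stand totals."""
    noise = rng.normal(1.0, per_tree_cv, size=(n_draws, N_TREES))
    return (true_kg * noise).sum(axis=1)

totals = stand_totals()
print(f"relative error of the stand total ≈ {100 * totals.std() / totals.mean():.2f}%")
# Independent per-tree errors mostly cancel in the sum, which is why the
# stand-level relative error is far smaller than the ±2% per-tree error.
```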

Spatial Variation

[Figure: aboveground biomass (Mg/ha) vs. year of measurement, 1975-2010; Biopak mean with ±2 SE bounds across plots. Relative error declines from ≈50% to ≈4% over time; N = 138 plots.]
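
The calculation behind this kind of curve can be sketched as follows: the spatial uncertainty of the stand mean is the standard error across plots, and the relative error is taken here as 2 SE over the mean to match the ±2 SE bounds (the talk may define it differently). The plot values are synthetic stand-ins, not the WS01 data.

```python
import numpy as np

def relative_spatial_error(plot_biomass_mg_ha):
    """Relative spatial error (%) = 2 * (SE of the plot mean) / mean * 100."""
    plots = np.asarray(plot_biomass_mg_ha, dtype=float)
    se = plots.std(ddof=1) / np.sqrt(plots.size)
    return 100.0 * 2.0 * se / plots.mean()

rng = np.random.default_rng(1)
early = rng.gamma(shape=0.3, scale=10.0, size=138)   # patchy, low-biomass regrowth
late = rng.normal(loc=200.0, scale=60.0, size=138)   # closed canopy, more even

print(f"early year: {relative_spatial_error(early):.0f}% relative spatial error")
print(f"late year:  {relative_spatial_error(late):.0f}% relative spatial error")
```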

Relative Spatial Error

[Figure: relative spatial error (%) vs. year of measurement, 1975-2010, for the Biopak, Jenkins, and Lutz biomass equations.]

Model Parameter Error

[Figure: aboveground biomass (Mg/ha) vs. year of measurement, 1975-2010; Biopak mean with ±2 SE bounds. Assumed ±5% parameter variation, n = 3,000; mean relative error ≈ 1.5%.]

Model Selection Error

[Figure: aboveground biomass (Mg/ha) vs. year of measurement, 1975-2010; mean estimates from the Biopak, Lutz, and Jenkins equations. Relative error ≈ 10%; N = 3 models.]
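
One way to put a number on model selection error is to treat the N = 3 equation-set means as a small sample and compute their spread. The sketch below uses made-up stand means, and expressing the spread as 2 SE among models is an assumption here, not necessarily the statistic behind the ≈10% figure.

```python
import numpy as np

# Made-up stand means (Mg/ha) from three equation sets for one year,
# standing in for the Biopak, Lutz, and Jenkins estimates.
model_means = np.array([190.0, 215.0, 175.0])

grand_mean = model_means.mean()
se_models = model_means.std(ddof=1) / np.sqrt(model_means.size)  # SE among N = 3 models
print(f"grand mean ≈ {grand_mean:.0f} Mg/ha, "
      f"model selection relative error ≈ {100 * 2 * se_models / grand_mean:.0f}%")
```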

Combined Error

[Figure: aboveground biomass (Mg/ha) vs. year of measurement, 1975-2010; Biopak and Lutz means, each with ±2 SE bounds (final values labeled 175 and 235 Mg/ha). Combined relative error declines from ≈50% to ≈5%.]
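
If the four sources are treated as independent, their relative errors can be combined in quadrature, as in the sketch below. Whether the combined curve here was produced that way or by a full Monte Carlo is not stated on the slide, so treat this as one plausible recipe with illustrative magnitudes.

```python
import math

def combined_relative_error(*relative_errors_pct):
    """Combine independent relative errors (%) in quadrature."""
    return math.sqrt(sum(e ** 2 for e in relative_errors_pct))

# Illustrative magnitudes (%), loosely echoing the earlier slides:
measurement, spatial, parameter, selection = 0.1, 4.0, 1.5, 10.0
total = combined_relative_error(measurement, spatial, parameter, selection)
print(f"combined relative error ≈ {total:.1f}%")  # dominated by model selection
```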

Relative Source of Error: Biopak

[Figure: relative share of total error (%) from measurement, spatial, model parameter, and model selection sources, by year of measurement, 1980-2007.]
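
One way to read stacked bars like these is as the share of total variance contributed by each source. The sketch below computes that partition assuming independent sources combined in quadrature, with illustrative magnitudes; the talk's bars may be normalized differently.

```python
def error_contributions(errors_pct):
    """Percent of total variance contributed by each independent error source."""
    total_var = sum(e ** 2 for e in errors_pct.values())
    return {name: 100.0 * e ** 2 / total_var for name, e in errors_pct.items()}

sources = {"measurement": 0.1, "spatial": 4.0, "model parameter": 1.5, "model selection": 10.0}
for name, share in error_contributions(sources).items():
    print(f"{name:16s}: {share:5.1f}% of total variance")
```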

Relative Source of Error: Lutz

[Figure: relative share of total error (%) from measurement, spatial, model parameter, and model selection sources, by year of measurement, 1980-2007.]

How can we use uncertainty in synthesis science?

Which set of numbers differs?

• 10 versus 10.1

• 10 versus 100

Assess scientific progress

• A goal of science is to reduce uncertainty to the degree possible (we explain as much as we can)

• How do we know we are making progress if we do not honestly report uncertainty?

Why Address Model Selection Error?

[Conceptual diagram comparing estimates labeled A, B, and C.]

Where does the uncertainty lie? And what do we do about it?

• Measurement: improve precision, accuracy, and detection limits

• Natural variation: improve sampling design

• Model parameter: improve estimates of parameters

• Model selection: improve knowledge or use models that are truly general

Conclusions

• We need to start somewhere

– We may not know everything, but that has always been true

– There are unknown unknowns that are unknowable

– We do know uncertainty is not zero and it is not infinite

• We need to develop:

– Ways to effectively estimate uncertainty

– Standard guidelines for how to report and analyze uncertainty

– Publication expectations

Thanks to:

• Becky G. Fasth

• The QUEST team

• Ruth Yanai

• Everyone who collected the WS01 data

• NSF Andrews LTER; Quest RCN; Richardson Endowment

Example of Quantifying Uncertainty

• Carbon budget for WS01

• Old-growth Douglas-fir/western hemlock forest harvested in 1964-66

• Seeded and planted numerous times

• Repeated measurement of diameter at ground level and breast height of tagged trees in more than 100 plots

• Status of trees (live, dead, ingrowth) also noted

How Can We Use Uncertainty in a Useful Way for Synthesis Science?

• Stop hiding uncertainty

• Stop being judgmental about it

• Start reporting the building blocks (e.g., measurement errors, model parameter errors, etc.)

• Address model selection error fully