Harmon, Uncertainty analysis: An evaluation metric for synthesis science

Transcript of Harmon, Uncertainty analysis: An evaluation metric for synthesis science

Page 1

Uncertainty Analysis: An Evaluation Metric for Synthesis Science

Mark E. Harmon Richardson Chair and Professor

Department of Forest Ecosystems and Society, Oregon State University

ESA 2013 Organized Session

Page 2

Two Complementary Sides to Science

• Reductionist
– Reduce down
– Simplify
– Control confounding factors
– Additive to the degree possible

• Synthesis
– Build up
– Address complexity
– Retain confounding factors
– Interactive; the whole is more than the sum of its parts (?)

Page 3

Sources of Uncertainty-1

• Measurement error (experimental error)

• Natural variation in space and time

• Model parameter error

• Model selection error

Page 4

Sources of Uncertainty-2

• Measurement error (experimental error)

– Accuracy: how close to the truth?

– Precision: how repeatable?

– Detection limits: how small?

• Primarily considered in:
– Laboratory analyses
– Climate, hydrologic, and ecophysiology instrumentation

Page 5

Sources of Uncertainty-3

• Natural variation in space and time
– Improve estimates of the mean and variation via sampling design
– Cannot be completely eliminated

• Primarily considered in:
– Field sampling
– Field experiments
– Statistical tests
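
As a rough illustration (with invented biomass numbers, not the actual watershed data), the sketch below shows how a larger sample shrinks the standard error of the mean (≈ s/√n) while the underlying natural variation remains:

```python
import random
import statistics

random.seed(7)

# Hypothetical plot-level biomass values (Mg/ha) for a spatially
# variable stand; the true mean (150) and SD (60) are invented.
def sample_plots(n):
    return [random.gauss(150, 60) for _ in range(n)]

for n in (10, 40, 160):
    plots = sample_plots(n)
    se = statistics.stdev(plots) / n ** 0.5  # standard error of the mean
    print(f"n={n:4d}  mean={statistics.fmean(plots):6.1f}  SE={se:5.1f}")
```

Quadrupling n roughly halves the standard error, but the plot-to-plot variation itself (the 60 Mg/ha SD here) never goes away; it can only be characterized better.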

Page 6

Sources of Uncertainty-4

• Model parameter error
– Simple to complex conversions of one variable to another require a model
– Uncertainty in parameter values
– Can be reduced but not completely eliminated

• Primarily considered in:
– Ecosystem estimates
– Contrast these conversions: BA = π × DBH² / 4 versus Biomass = B₁ × DBH^B₂
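
The contrast on this slide can be made concrete. Basal area is a purely geometric conversion with no fitted parameters, while biomass comes from an allometric model whose parameters B₁ and B₂ must be estimated; the numeric parameter values below are hypothetical, not from any published equation:

```python
import math

def basal_area(dbh_cm):
    """Geometric conversion: BA = pi * DBH^2 / 4 (cm^2).
    No fitted parameters, so only measurement error enters."""
    return math.pi * dbh_cm ** 2 / 4

def biomass(dbh_cm, b1, b2):
    """Allometric model: Biomass = B1 * DBH^B2.
    B1 and B2 are fitted, so the estimate also carries
    model parameter error (and model selection error)."""
    return b1 * dbh_cm ** b2

print(basal_area(50.0))          # exact given the DBH measurement
print(biomass(50.0, 0.05, 2.5))  # depends on the hypothetical B1, B2
```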

Page 8

Sources of Uncertainty-5

• Model selection error
– Knowledge uncertainty about how to proceed
– Introduces a systematic, not a random, error
– Can only be reduced with more knowledge

• Primarily considered in:
– Ecosystem estimates
– Simulation models
– Synthetic efforts
– Example: Are tree stems cones, neiloids, or paraboloids?

Page 10

Watershed 1 H. J. Andrews Experimental Forest

[Photos: Watershed 1 before burning, 20 years after burning, and 30 years after burning.]

Page 11

Measurement error

[Figure: Aboveground biomass (Mg/ha) versus year of measurement, 1975–2010; Biopak mean with ±2 standard errors. Measurement error of ±2% per tree; mean relative error ≈ 0.09%; N = 3,000.]
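
The pattern on this slide, a ±2% per-tree error collapsing to a tiny stand-level error, can be sketched with a Monte Carlo simulation. The tree biomasses below are invented; only the 2% per-tree error and N = 3,000 echo the slide:

```python
import random
import statistics

random.seed(42)

# Invented biomasses (kg) for a stand of 3,000 tagged trees.
trees = [random.uniform(50, 500) for _ in range(3000)]
true_total = sum(trees)

# Each Monte Carlo draw perturbs every tree independently by a
# ~2% measurement error and records the resulting stand total.
totals = []
for _ in range(500):
    totals.append(sum(b * random.gauss(1.0, 0.02) for b in trees))

rel_err = statistics.stdev(totals) / true_total
print(f"stand-level relative error: {rel_err:.3%}")
```

Because the per-tree errors are independent, they largely cancel in the sum: the stand-level relative error scales roughly as 2%/√N, far below the per-tree error.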

Page 12

Spatial Variation

[Figure: Aboveground biomass (Mg/ha) versus year of measurement, 1975–2010; Biopak mean with ±2 SE. Relative error declines from ≈50% to ≈4% over time; N = 138 plots.]

Page 13

Relative Spatial Error

[Figure: Relative spatial error (%) versus year of measurement for the Biopak, Jenkins, and Lutz biomass equations.]

Page 14

Model Parameter Error

[Figure: Aboveground biomass (Mg/ha) versus year of measurement, 1975–2010; Biopak mean with ±2 SE. Assumed ±5% parameter variation; mean relative error ≈ 1.5%; n = 3,000.]
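
Model parameter error behaves differently from measurement error: a parameter is drawn once and applied to every tree, so it does not cancel with sample size. A minimal Monte Carlo sketch, assuming a hypothetical allometry and treating the ±5% parameter variation as a normal SD of 5% of each central value:

```python
import random
import statistics

random.seed(1)

def biomass(dbh, b1, b2):
    # Hypothetical allometry: Biomass = B1 * DBH^B2 (values invented)
    return b1 * dbh ** b2

B1, B2 = 0.05, 2.5                                  # illustrative central values
dbhs = [random.uniform(10, 60) for _ in range(200)]  # invented stand of DBHs (cm)

totals = []
for _ in range(1000):
    b1 = random.gauss(B1, 0.05 * B1)  # redraw the parameters once per run...
    b2 = random.gauss(B2, 0.05 * B2)
    totals.append(sum(biomass(d, b1, b2) for d in dbhs))  # ...apply to all trees

central = sum(biomass(d, B1, B2) for d in dbhs)
rel = statistics.stdev(totals) / central
print(f"relative parameter error: {rel:.1%}")
```

The key design point: the same parameter draw is shared across all trees in a run, so this error source stays whole no matter how many trees are measured.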

Page 15

Model Selection Error

[Figure: Aboveground biomass (Mg/ha) versus year of measurement, 1975–2010; Biopak, Lutz, and Jenkins means. Relative error ≈ 10%; N = 3 models.]

Page 16

Combined Error

[Figure: Aboveground biomass (Mg/ha) versus year of measurement, 1975–2010; Biopak and Lutz means with ±2 SE, with final estimates of ≈175 and ≈235 Mg/ha. Relative error declines from ≈50% to ≈5%.]
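
One common way to combine the four sources, assuming they are independent, is to add relative errors in quadrature. The numbers below are illustrative only; they loosely echo the magnitudes quoted on earlier slides:

```python
import math

# Illustrative relative errors (as fractions), assumed independent.
errors = {
    "measurement": 0.001,      # ~0.1% (cf. the slide's ~0.09%)
    "spatial": 0.04,           # ~4% in later years
    "model parameter": 0.015,  # ~1.5%
    "model selection": 0.10,   # ~10%
}

# Under independence, relative errors combine in quadrature.
combined = math.sqrt(sum(e ** 2 for e in errors.values()))
print(f"combined relative error: {combined:.1%}")  # dominated by model selection
```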

Page 17

Relative Source of Error Biopak

[Figure: Stacked relative error (%) by source (measurement, spatial, model parameter, model selection) for measurement years 1980–2007.]

Page 18

Relative Source of Error Lutz

[Figure: Stacked relative error (%) by source (measurement, spatial, model parameter, model selection) for measurement years 1980–2007.]

Page 19

How can we use uncertainty in synthesis science?

Page 20

Which set of numbers differs?

• 10 versus 10.1

• 10 versus 100

Page 23

Assess scientific progress

• A goal of science is to reduce uncertainty to the degree possible (we explain as much as we can)

• How do we know we are making progress if we do not honestly report uncertainty?

Page 24

Why Address Model Selection Error?

[Figure: schematic panels labeled A, B, and C contrasting alternative model choices; details not recoverable from the transcript.]

Page 30

Where does the uncertainty lie? And what do we do about it?

• Measurement: improve precision, accuracy, and detection limits

• Natural variation: improve sampling design

• Model parameter: improve estimates of parameters

• Model selection: improve knowledge or use models that are truly general
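
These priorities can be read off a simple error budget: under an independence assumption, each source's share of the combined variance shows where improvement would pay off most. Values are illustrative, loosely echoing the talk's examples:

```python
# Illustrative relative errors (as fractions) for the four sources.
errors = {
    "measurement": 0.001,
    "spatial": 0.04,
    "model parameter": 0.015,
    "model selection": 0.10,
}

# Rank sources by their contribution to the combined variance.
total_var = sum(e ** 2 for e in errors.values())
for name, e in sorted(errors.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:16s} {e ** 2 / total_var:6.1%} of combined variance")
```

With these inputs, model selection error dominates the budget, matching the talk's call to address it fully rather than improve already-small measurement errors.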

Page 31

Conclusions

• We need to start somewhere
– We may not know everything, but that has always been true
– There are unknown unknowns that are unknowable
– We do know uncertainty is not zero and it is not infinite

• We need to develop:
– Ways to effectively estimate uncertainty
– Standard guidelines for how to report and analyze it
– Publication expectations

Page 32

Thanks to:

• Becky G. Fasth

• The QUEST team

• Ruth Yanai

• Everyone who collected the WS01 data

• NSF Andrews LTER; Quest RCN; Richardson Endowment

Page 33

Example of Quantifying Uncertainty

• Carbon budget for WS01

• Old-growth Douglas-fir/western hemlock forest harvested in 1964-66

• Seeded and planted numerous times

• Repeated measurement of diameter at ground and breast height of tagged trees in more than 100 plots

• Status of trees (live, dead, ingrowth) also noted

Page 34

How Can We Use Uncertainty in a Useful Way for Synthesis Science?

• Stop hiding uncertainty

• Stop being judgmental about it

• Start reporting the building blocks (e.g., measurement errors, model parameter errors, etc.)

• Address model selection error fully