Results of Implementation and Testing Soils and Riparian – What Did We Learn?

Page 1: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

Results of Implementation and Testing Soils and Riparian – What Did We Learn?

Page 2: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

Purpose of Routine Evaluations

• Unclear

• It is our understanding that this was driven by the need to assess results under results-based legislation, i.e., the Forest and Range Practices Act (FRPA)

• The underlying purpose common to most evaluations of forest resources is to learn about forest practices.

Page 3: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

Uses of Indicators

• Compliance and Effectiveness Audits

• Inspections

• Long-term Research

• Effectiveness Monitoring

• Etc.

Page 4: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

What Did We Do?

• Compliance and Effectiveness Audits

• Testing the indicators was not the objective of the audits

• We used the indicators as the basis for a set of commonly accepted indicators of effective soil conservation and stream management.

• We can’t discuss the audit results, as they have not yet been reported

• We can discuss the process and the use of the indicators in the audits

Page 5: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

Use of Indicators in Audits

• The audit relates to the forest practices (versus the streams or soils themselves)

• Indicators were used to establish a common understanding of the effectiveness of forest practices affecting soils and streams

• This required some adjustments to the draft indicators to ensure that the use of the indicators was informative about the underlying forest practices. (handout)

Page 6: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

Use of Indicators (Cont.)

• We also developed audit programs that incorporated the indicators to allow audit analysis. This provides the basis for determining whether or not an indicator is achieved, and why.

• This is an important point. The technical design of an evaluation is critical to proper application of the indicators.

• Development teams should consider the type, purpose, and objectives of “routine evaluations” when developing indicators.

Page 7: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

What Did We Learn? (Key Messages)

• The development teams developed indicators and methodologies; the audit teams used the indicators, not the methodologies

• It would have been beneficial if the development teams had involved people with the varied expertise of those who will be using the indicators.

Page 8: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

What Did We Learn? (Cont.)

• Conducting the compliance audit assessment at the same time as the effectiveness assessment works well – it provides a linkage to the forest practices

• Implementation teams need to incorporate the necessary expertise – we found substantial benefit in having one of the development team members on each audit team

• Site-level field assessments were necessary to use the indicators (for most assessments)

• Using professional judgement is inherent to using indicators
– When to apply and take detailed measurements
– Interpreting results

Page 9: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

What Did We Learn? (Cont.)

• Indicators need to be general enough to facilitate the use of professional judgement

• Indicators, as developed, were “commonly accepted” indicators.

• Each indicator should be supported by a short rationale

Page 10: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

What Did We Learn? (Cont.)

• Recommend that the indicators be tested using different routine evaluations, e.g., compliance and enforcement inspections

• As part of training, having indicator developers, scientists, and implementers in the field together prior to starting the audits was very beneficial

• Cannot evaluate achievement of indicators without the collection of data (checklists were developed).

• Different forms of data for different evaluations

Page 11: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

What Did We Learn? (Cont.)

• The indicators we used were “accepted” as common indicators (no argument from auditees – recognizing that the audits are not complete)

• The combination of using scientists, foresters, experts, auditors, etc. works very well.

• Licensees and stewardship groups should be involved in indicator development as well.

Page 12: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

Lessons Learned

• The model of MOF developing indicators and board auditors developing the audit program worked adequately
– A stronger relationship between the indicators and forest practices is needed
– Indicators were essentially premised on best management practices

Page 13: Results of Implementation and Testing Soils and Riparian – What Did We Learn?

END