Setting the Stage: Workshop Framing and Crosscutting Issues



Simon Hearn, ODI

Evaluation Methods for Large-Scale, Complex, Multi-National Global Health Initiatives, Institute of Medicine

7-8 January, London, UK

What do we mean by complex interventions?

The nature of the intervention:
1. Focus of objectives
2. Governance
3. Consistency of implementation

How it works:
4. Necessariness
5. Sufficiency
6. Change trajectory

Photo: Les Chatfield - http://www.flickr.com/photos/elsie/

What are the challenges of evaluating complex interventions?

Describing what is being implemented
Getting data about impacts
Attributing impacts to a particular programme

Photo: Les Chatfield - http://www.flickr.com/photos/elsie/

Wikipedia: Evaluation Methods

Why a framework is needed

Image: Simon Kneebone. http://simonkneebone.com/

Why a framework is needed

The Rainbow Framework

DEFINE what is to be evaluated

Why do we need to start with a clear definition?

Photo: Hobbies on a Budget / Flickr

1. Develop initial description

2. Develop program theory or logic model

3. Identify potential unintended results

Options for representing logic models

Pipeline / results chain

Logical framework

Outcomes hierarchy / theory of change

Realist Matrix

FRAME what is to be evaluated

Source: Hobbies on a Budget / Flickr

Frame Decision → Make Decision

Frame Evaluation → Design Evaluation

1. Identify primary intended users

2. Decide purpose(s)
3. Specify key evaluation questions
4. Determine what ‘success’ looks like

DESCRIBE what happened

1. Sample
2. Use measures, indicators or metrics
3. Collect and/or retrieve data
4. Manage data
5. Combine qualitative and quantitative data
6. Analyze data
7. Visualize data

Combine qualitative and quantitative data

Enrich / Examine / Explain / Triangulate

Parallel / Sequential

Component / Integrated

UNDERSTAND CAUSES of outcomes and impacts


As a profession, we often either oversimplify causation or we overcomplicate it!

“In my opinion, measuring attribution is critical, and we can't do that unless we use control groups to compare them to.”

Comment in an expert discussion on The Guardian online, May 2013

1. Check that the results support causal attribution

2. Compare results to the counterfactual

3. Investigate possible alternative explanations
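Step 2 — comparing results to the counterfactual — can be sketched as a simple mean difference between a treatment group and a comparison group. This is a minimal, hypothetical illustration (the data and function names are not from the workshop), not a recommendation for any particular design:

```python
# Illustrative only: estimate a programme effect as the difference
# between observed outcomes and a counterfactual (comparison group).
# All data here are hypothetical.

def mean(values):
    return sum(values) / len(values)

def estimated_effect(treatment_outcomes, comparison_outcomes):
    """Difference between treatment-group and comparison-group means."""
    return mean(treatment_outcomes) - mean(comparison_outcomes)

# Hypothetical health-outcome scores
treated = [72, 68, 75, 80, 77]
control = [65, 70, 66, 69, 64]

print(round(estimated_effect(treated, control), 2))  # → 7.6
```

A single mean difference like this only supports attribution if the comparison group is a credible counterfactual — which is exactly why step 3 (investigating alternative explanations) follows it.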

SYNTHESIZE data from one or more evaluations

Was it good? Did it work?

Was it effective?

For whom did it work?

In what ways did it work?

Was it value for money? Was it cost-effective?

Did it succeed in terms of the Triple Bottom Line?


How do we synthesize diverse evidence about performance?

All intended impacts achieved

Some intended impacts achieved

No negative impacts

Overall synthesis: GOOD / ?? / ?? / BAD

1. Synthesize data from a single evaluation

2. Synthesize data across evaluations

3. Generalize findings

REPORT and SUPPORT USE of findings

“I can honestly say that not a day goes by when we don’t use those evaluations in one way or another”

1. Identify reporting requirements

2. Develop reporting media

3. Ensure accessibility
4. Develop recommendations
5. Support use

MANAGE your evaluation

1. Understand and engage with stakeholders 

2. Establish decision making processes

3. Decide who will conduct the evaluation

4. Determine and secure resources
5. Define ethical and quality evaluation standards
6. Document management processes and agreements
7. Develop evaluation plan or framework
8. Review evaluation
9. Develop evaluation capacity

DESCRIBE · UNDERSTAND CAUSES · SYNTHESIZE · REPORT & SUPPORT USE

Descriptive questions – Was the policy implemented as planned?

Causal questions – Did the policy change contribute to improved health outcomes?

Synthesis questions – Was the policy overall a success?

Action questions – What should we do?

Making decisions: look at the type of questions

Making decisions: compare pros and cons

Data sources: Participant Questionnaire / Key Informant Interviews / Project Records / Observation of program implementation

KEQ1 What was the quality of implementation? ✔ ✔ ✔ ✔
KEQ2 To what extent were the program objectives met? ✔ ✔ ✔
KEQ3 What other impacts did the program have? ✔ ✔
KEQ4 How could the program be improved? ✔ ✔ ✔

Making decisions: create an evaluation matrix
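An evaluation matrix like the one above can also be kept as a simple data structure, so the plan can be checked and inverted. A sketch under stated assumptions: the slide gives only the number of checkmarks per question (KEQ1 uses all four sources), so the other question-to-source assignments below are illustrative, not the slide's:

```python
# Illustrative only: an evaluation matrix as a mapping from each key
# evaluation question (KEQ) to its planned data sources. Assignments
# other than KEQ1's full coverage are hypothetical.

evaluation_matrix = {
    "KEQ1 What was the quality of implementation?": [
        "Participant Questionnaire", "Key Informant Interviews",
        "Project Records", "Observation of program implementation"],
    "KEQ2 To what extent were the program objectives met?": [
        "Participant Questionnaire", "Key Informant Interviews",
        "Project Records"],
    "KEQ3 What other impacts did the program have?": [
        "Key Informant Interviews", "Observation of program implementation"],
    "KEQ4 How could the program be improved?": [
        "Participant Questionnaire", "Key Informant Interviews",
        "Observation of program implementation"],
}

# Sanity check: every question has at least one planned data source.
for keq, sources in evaluation_matrix.items():
    assert sources, f"No data source planned for {keq}"

# Invert the matrix to see how heavily each source is relied on.
source_load = {}
for sources in evaluation_matrix.values():
    for s in sources:
        source_load[s] = source_load.get(s, 0) + 1
print(source_load)
```

Inverting the matrix is a cheap planning check: it surfaces overloaded data sources (here, interviews would serve all four questions) and questions left with thin coverage before any data collection starts.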

www.betterevaluation.org

WEBSITE: Examples, Descriptions, Tools, Guides, Comments

ACTIVITIES: R & D, Documenting, Sharing, Events

Founding Partners

Financial Supporters

For more information:www.betterevaluation.org/start_here