IMPACT EVALUATION
Impact Evaluation for Evidence-Based Policy Making
Arianna Legovini
Lead, Africa Impact Evaluation Initiative

(Transcript of slide presentation)


Answer Three Questions

• Why is evaluation valuable?

• What makes a good impact evaluation?

• How to implement evaluation?


IE Answers: How do we turn this teacher…


…into this teacher?


Why Evaluate?

• Need evidence on what works
  – Allocate limited budget
  – Fiscal accountability

• Improve program/policy over time
  – Operational research
  – Managing by results

• Information key to sustainability
  – Negotiating budgets
  – Informing constituents and managing the press
  – Informing donors


Traditional M&E and Impact Evaluation

• Monitoring to track implementation efficiency (input → output)

• Impact evaluation to measure effectiveness (output → outcome)

[Results-chain diagram: $$$ → INPUTS → OUTPUTS → (BEHAVIOR) → OUTCOMES. "Monitor efficiency" spans inputs to outputs; "Evaluate effectiveness" spans outputs to outcomes.]


Question types and methods

• Process Evaluation / Monitoring: descriptive analysis
  ▫ Is the program being implemented efficiently?
  ▫ Is the program targeting the right population?
  ▫ Are outcomes moving in the right direction?

• Impact Evaluation: causal analysis
  ▫ What was the effect of the program on outcomes?
  ▫ How would outcomes change under alternative program designs?
  ▫ Does the program impact people differently (e.g. females, the poor, minorities)?
  ▫ Is the program cost-effective?


Which can be answered by traditional M&E and which by IE?

• Are books being delivered as planned? → M&E

• Does de-worming increase school attendance? → IE

• What is the correlation between enrollment and school quality? → M&E

• Does decentralized school management lead to an increase in learning achievement? → IE


Types of Impact Evaluation

• Efficacy:
  – Proof of concept
  – Pilot under ideal conditions

• Effectiveness:
  – At scale
  – Normal circumstances & capabilities
  – Lower or higher impact?
  – Higher or lower costs?


So, use impact evaluation to…

• Test innovations
• Scale up what works (e.g. de-worming)
• Cut/change what does not (e.g. HIV counseling)
• Measure effectiveness of programs (e.g. JTPA)
• Find the best tactics to change people's behavior (e.g. come to the clinic)
• Manage expectations

Example: PROGRESA/OPORTUNIDADES (Mexico)
  – Transition across presidential terms
  – Expansion to 5 million households
  – Change in benefits
  – Battle with the press


Next question please

• Why is evaluation valuable?

• What makes a good impact evaluation?

• How to implement evaluation?


Assessing impact

• Examples
  – How much do girl scholarships increase school enrollment?
  – What is the level of beneficiaries' learning achievement with the program compared to without it?

• Ideally: compare the same individual with and without the program at the same point in time

• But we never observe the same individual with and without the program at the same point in time
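
In potential-outcomes notation (standard notation, not used on the slides), this is the fundamental evaluation problem; a minimal sketch:

```latex
% Y_i(1): outcome for individual i with the program
% Y_i(0): outcome for the same individual without the program
\[
\text{impact}_i = Y_i(1) - Y_i(0)
\]
% Only one of the two potential outcomes is ever observed for a given
% individual, so the impact can never be computed directly: a comparison
% group must stand in for the missing counterfactual.
```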


Solving the evaluation problem

• Counterfactual: what would have happened without the program

• Need to estimate the counterfactual
  – i.e. find a control or comparison group

• Counterfactual criteria
  – Treated & counterfactual groups have identical initial characteristics on average
  – The only reason for the difference in outcomes is the intervention
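
When these criteria hold, the simple difference in mean outcomes between the two groups identifies the average impact; a sketch of the standard decomposition (T = 1 treated, T = 0 comparison; notation assumed, not from the slides):

```latex
\[
\underbrace{E[Y \mid T=1] - E[Y \mid T=0]}_{\text{observed difference}}
= \underbrace{E[Y(1)-Y(0) \mid T=1]}_{\text{average impact on the treated}}
+ \underbrace{E[Y(0) \mid T=1] - E[Y(0) \mid T=0]}_{\text{selection bias}}
\]
% The counterfactual criteria make the selection-bias term zero, so the
% observed difference in means equals the program's average impact.
```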


2 “Counterfeit” Counterfactuals

• Before and after:
  – The same individual before the treatment

• Non-participants:
  – Those who choose not to enroll in the program
  – Those who were not offered the program


Before and After Example

• Food Aid
  – Compare mortality before and after
  – Find increase in mortality
  – Did the program fail?
  – "Before" normal year, but "after" famine year
  – Cannot separate (identify) effect of food aid from effect of drought


Before and After

• Compare Y before and after the intervention
  – B: before-after counterfactual
  – A-B: estimated impact

• Control for time-varying factors
  – C: true counterfactual
  – A-C: true impact; here A-B is under-estimated

[Figure: Y plotted over time from t-1 (before) to t (after), with the treatment at t. A is the observed outcome after treatment, B is the pre-treatment level used as the before-after counterfactual, and C is the true counterfactual at t.]
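
A tiny simulation makes the same point; this is an illustrative sketch with made-up numbers (not from the presentation), in which outcomes would have worsened even without the program, so the before-after comparison under-estimates the true impact:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Made-up outcome at t-1 (e.g. a nutrition or survival score).
y_before = 50 + rng.normal(0, 5, n)

shock = -4        # what would have happened anyway between t-1 and t (e.g. drought)
true_effect = 6   # what the program actually adds

# Observed outcome at t: counterfactual drift plus the program effect.
y_after = y_before + shock + true_effect + rng.normal(0, 5, n)

before_after = y_after.mean() - y_before.mean()   # ignores the shock
print(f"before-after estimate: {before_after:.1f}")   # about 2
print(f"true impact:           {true_effect}")        # 6
# The negative shock in the 'after' period is wrongly attributed to the
# program, so the before-after estimate (A-B) under-states the true impact (A-C).
```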


Non-Participants….

• Compare non-participants to participants

• Counterfactual: non-participant outcomes

• Problem: why did they not participate?


Exercise: Why do participants and non-participants differ?

• Children who come to school and children who do not?

• Communities that applied for funds for a new classroom and communities that did not?

• Children who received scholarships and children who did not?

Possible answers: access to school, poorer, unmet demand, more organized community, achievement, poverty, gender.


Literacy program example

• Treatment offered
• Who signs up?
  – Those who are illiterate
  – Have lower education than those who do not sign up
• Educated people are a poor estimate of counterfactual


What's wrong?

• Selection bias: People choose to participate for specific reasons

• Often these reasons are directly related to the outcome of interest

• Cannot separately identify impact of the program from these other factors/reasons
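
A small simulation of the literacy example shows how selection bias swamps the program effect; all numbers and variable names here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Invented baseline literacy score, 0-100.
baseline = rng.uniform(0, 100, n)

# Adults with low literacy are far more likely to sign up for the course.
signed_up = rng.random(n) < np.where(baseline < 50, 0.8, 0.1)

true_effect = 15   # what the course actually adds to the literacy score
outcome = baseline + np.where(signed_up, true_effect, 0) + rng.normal(0, 5, n)

naive_gap = outcome[signed_up].mean() - outcome[~signed_up].mean()
print(f"participants minus non-participants: {naive_gap:.1f}")   # strongly negative
print(f"true effect of the course:           {true_effect}")
# Participants started far less literate, so the naive comparison mixes the
# program effect with selection and cannot identify the impact.
```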


Program placement example

• Government offers school inputs program to schools with low infrastructure

• Compare achievement in schools offered program to achievement in schools not offered

• Program targeted based on lack of inputs, so
  – Treatments have low achievement
  – Counterfactuals have high achievement

• Cannot separately identify program impact from school targeting criteria


Need to know…

• Why some get the program and others do not
• How some get into the treatment group and others into the control group

• If these reasons are correlated with the outcome, we cannot identify/separate the program impact from other explanations of differences in outcomes

• The process by which data is generated


Possible Solutions…

• Guarantee comparability of treatment and control groups

• ONLY remaining difference is intervention

• In this workshop we will consider
  – Experimental design/randomization
  – Quasi-experiments
    ▫ Regression discontinuity
    ▫ Double differences (see the sketch below)
  – Instrumental variables
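
As a flavour of the double-difference (difference-in-differences) idea, here is a minimal sketch with invented data; it assumes baseline and follow-up outcomes for both a treated group and a comparison group that share a common time trend:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5_000

trend, true_effect = 3.0, 5.0                       # invented numbers

# Groups start at different levels but share the same time trend.
treated_before = 40 + rng.normal(0, 4, n)
control_before = 45 + rng.normal(0, 4, n)
treated_after = treated_before + trend + true_effect + rng.normal(0, 4, n)
control_after = control_before + trend + rng.normal(0, 4, n)

# Double difference: change among treated minus change among controls.
did = (treated_after.mean() - treated_before.mean()) \
    - (control_after.mean() - control_before.mean())
print(f"double-difference estimate: {did:.1f}")     # about 5, the true effect
# The first difference removes the fixed gap between groups; the second
# removes the common trend - which is why a baseline survey is essential.
```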


These solutions all involve…

• Randomization (see the sketch below)
  – Give all an equal chance of being in the control or treatment group
  – Guarantees that all factors/characteristics will be equal between groups on average
  – The only difference is the intervention

• If not, need transparent & observable criteria for who is offered the program
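
A hedged sketch of what randomization buys, again with invented data: random assignment balances characteristics across groups, so the simple difference in mean outcomes recovers the impact:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Invented characteristics that also drive the outcome.
poverty = rng.normal(0, 1, n)
ability = rng.normal(0, 1, n)

# Random assignment: every eligible unit has the same chance of treatment.
treated = rng.random(n) < 0.5

true_effect = 2.0
outcome = 10 - 3 * poverty + 4 * ability + true_effect * treated + rng.normal(0, 1, n)

# Balance check: characteristics are (on average) equal between groups.
print(f"poverty gap: {poverty[treated].mean() - poverty[~treated].mean():+.3f}")
print(f"ability gap: {ability[treated].mean() - ability[~treated].mean():+.3f}")

# The difference in mean outcomes recovers the true impact (about 2).
print(f"estimated impact: {outcome[treated].mean() - outcome[~treated].mean():.2f}")
```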


The Last Question

• Why is evaluation valuable?

• What makes a good impact evaluation?

• How to implement evaluation?


Implementation Issues

• Political economy

• Policy context

• Finding a good control
  – Retrospective versus prospective designs
  – Making the design compatible with operations
  – Ethical issues

• Relationship to “results” monitoring


Political Economy

• What is the policy purpose?
  – In the USA: test innovations to national policy, defend the budget
  – In RSA: answer to the electorate
  – In Mexico: allocate the budget to poverty programs
  – In an IDA country: pressure to demonstrate aid effectiveness and scale up
  – In a poor country: hard constraints and ambitious targets; how to reach those targets?


Evidence culture and incentives for change

• Cultural shift
  – From retrospective evaluation: look back and judge
  – To prospective evaluation: decide what we need to learn, experiment with alternatives, measure and inform, adopt better alternatives over time

• Change in incentives
  – Rewards for changing programs that do not work
  – Rewards for generating knowledge
  – Separating job performance from knowledge generation


The Policy Context

• Address policy-relevant questions:
  – What policy questions need answers?
  – What outcomes answer those questions?
  – What indicators measure those outcomes?
  – How much of a change in the outcomes would determine success?

• Example: teacher performance-based pay
  – Scale up the pilot?
  – Criterion: need at least a 10% increase in test scores with no change in unit costs


Opportunities for good designs

• Use opportunities to generate good control groups

• Most programs cannot deliver benefits to all those eligible
  – Budgetary limitations:
    ▫ Eligible who get it are potential treatments
    ▫ Eligible who do not are potential controls
  – Logistical limitations:
    ▫ Those who go first are potential treatments
    ▫ Those who go later are potential controls


Who gets the program?

• Eligibility criteria
  – Are benefits targeted?
  – How are they targeted?
  – Can we rank eligibles by priority?
  – Are measures good enough for fine rankings?

Who goes first?
• Roll-out
  – Equal chance to go first, second, third?


Ethical Considerations

• Do not delay benefits: Rollout based on budget/administrative constraints

• Equity: equally deserving beneficiaries deserve an equal chance of going first

• Transparent & accountable method

– Give everyone eligible an equal chance

– If ranking is based on some criteria, the criteria should be quantitative and public


Retrospective Designs

• Hard to find good control groups
  – Must live with arbitrary or unobservable allocation rules

• Administrative data
  – Must be good enough to show the program was implemented as described

• Need a pre-intervention baseline survey
  – On both controls and treatments
  – With covariates to control for initial differences

• Without a baseline it is difficult to use quasi-experimental methods


Manage for results

• Retrospective evaluation cannot be used to manage for results

• Use resources wisely: design the evaluation prospectively
  – Better methods
  – More tailored policy questions
  – Precise estimates
  – Timely feedback and program changes
  – Improve results on the ground


Monitoring Systems

• Projects/programs regularly collect data for management purposes

• Typical content
  – Lists of beneficiaries
  – Distribution of benefits
  – Expenditures
  – Outputs
  – Ongoing process evaluation

• This information is needed for impact evaluation


Evaluation uses administrative information to verify:

• Who is a beneficiary
• When benefits started
• What benefits were actually delivered

Necessary condition for the program to have an impact:

• Benefits need to get to the targeted beneficiaries


Improve use of administrative data for IE

• Program monitoring data are usually collected only in areas where the program is active
  – Collect a baseline for control areas as well

• Very cost-effective, as there is little need for additional special surveys
  – Add a couple of outcome indicators

• Most IEs use only monitoring data


Overall Messages

• Impact evaluation is useful for
  – Validating program design
  – Adjusting program structure
  – Communicating to the finance ministry & civil society

• A good evaluation design requires estimating the counterfactual
  – What would have happened to beneficiaries if they had not received the program
  – Need to know all the reasons why beneficiaries got the program & others did not


Design Messages

• Address policy questions
  – What is interesting is what government needs and will use

• Stakeholder buy-in

• Easiest to use prospective designs

• Good monitoring systems & administrative data can improve IE and lower costs