
1st edition 1978

2nd edition 1986

3rd edition 1997

4th edition 2008


Research-based Approach

• Original follow-up study of the use of 20 federal health evaluations

• 40 years of research on use


Utilization-Focused Evaluation (U-FE)

A decision-making framework for enhancing the utility and actual use of evaluations.


U-FE begins with the premise that evaluations should be judged by their utility and actual use. Therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that will be done, from beginning to end, will affect use.


USE

• Take use seriously by evaluating use, the source of our own accountability and ongoing learning/professional development

• Different from dissemination

• Different from producing reports

• Groundwork laid and expectations set at the beginning

• Doesn’t happen naturally or automatically


In the beginning…

U-FE Checklist


SPEAKING TRUTH TO POWER


Shake Hands with the Devil, by Roméo Dallaire


“I can honestly say that not a day goes by when we don’t usethose evaluations in one way or another.”

Written by Mark M. Rogers and illustrated by Lawson Sworh

Beyond Token Use

U-FE Checklist


Exercise

Good idea that didn’t work out in practice and how you came to know that it didn’t work out.


Laying the foundation for use:

Situation analysis and setting the stage


The first step in Utilization-Focused Evaluation is situation analysis: assessing the readiness of a program, project or organization to commit to and undertake using evaluation.

Step 1: Assess and Build Program and Organizational Readiness for Evaluation


Formally launching the evaluation:

Exercises to help assess and build readiness for evaluation – and get the process started


Menu of Exercises to Assess and Facilitate Stakeholder and Program Readiness for Utilization-Focused Evaluation

1. Baseline assessment of evaluation use:

How are data currently collected being used? How have past evaluations been used?

• Situation for which this exercise is particularly appropriate: organizations already engaged in some evaluation and/or ongoing data collection.


Menu of Exercises to Assess and Facilitate Stakeholder and Program Readiness for Utilization-Focused Evaluation

2. Baseline associations with and perceptions of evaluation:

“What comes to mind when you see the word EVALUATE?”

• Use with a group of stakeholders brought together to launch a new evaluation effort; surfaces the “baggage” that people bring to the new initiative from past experiences.


Menu of Exercises to Assess and Facilitate Stakeholder and Program Readiness for Utilization-Focused Evaluation

3. Create a positive vision for evaluation:

If evaluation was really useful and actually used here, what would that look like?

• Use to help a group move from evaluation anxiety and resistance to focus on the potential benefits of evaluation. Creates readiness for evaluation by focusing on use and creating a shared group understanding of and commitment to use the evaluation.


Menu of Exercises to Assess and Facilitate Stakeholder and Program Readiness for Utilization-Focused Evaluation

4. Assess incentives for and barriers to reality testing and evaluation use in their own program culture.

Once a group has a general sense of evaluation’s potential utility, this exercise takes the next step of getting concrete about what will need to occur within this program context to make evaluation useful.


Menu of Exercises to Assess and Facilitate Stakeholder and Program Readiness for Utilization-Focused Evaluation

5. Engender commitment to reality-testing:

• Are you willing to take a close look at whether what you think is happening in this program is actually happening, and whether what you hope it is accomplishing is actually being accomplished?

These questions are useful at the moment of commitment after some basic groundwork has been laid. These questions implicitly ask those involved: Are you ready to get serious?


GREAT MOMENTS IN EVALUATION DESIGN

“Just to be on the safe side, let’s look into evaluation models that don’t involve working with people.”


Baseline Question

How would you describe your current organization’s level of evaluation use?

1. High evaluation use: Regularly and visibly uses evaluation findings to inform decisions.

2. Moderate evaluation use: Occasionally uses some evaluation findings to inform some decisions.

3. Low use: Seldom or never uses evaluation findings to inform decisions.


Baseline Question #2

To what extent are you doing the things you know you should be doing in your personal life, based on what you know? How would you describe yourself personally?

1. High information user – my behavior matches my knowledge.

2. Moderate information user – much of my behavior matches my knowledge, but I have some major lapses.

3. Low information user – my behavior often falls short of what I know I should be doing.


Evaluation and Research: Same or different? And why?


Foundational Premise:

Research and evaluation are different – and therefore evaluated by different standards.


Evaluation Standards

Utility – ensure relevance & use

Feasibility – realistic, prudent, diplomatic & frugal

Propriety – ethical, legal, respectful

Accuracy – technically adequate to determine merit or worth

Accountability – metaevaluation

For the full list of Standards:

www.wmich.edu/evalctr/checklists/standardschecklist.htm

Evaluation Models

U-FE is one among many….


Example

Caribbean Agricultural Extension Project

U.S. AID

European Development Bank

Caribbean Development Bank

University of the West Indies

Big Ten Universities (MUCIA)

CARDI


Overview

• 10-year project in 3 phases

• Phase 1, January 1980 to December 1982

• Mid-term and end-of-phase evaluations planned and budgeted

• 10 Caribbean countries


Utilization Question

When will evaluation results be needed to contribute to the summative decision about whether to fund Phase 2?

When do you think?


When will evaluation results be needed to contribute to the summative decision about whether to fund Phase 2?

State Department budget to Congress

Caribbean Budget to State Department

Barbados Regional Office to Wash DC and Caribbean Region Office

Agriculture program to Caribbean


Evaluation Questions

• Are outstanding extension agents making a significant contribution to improved farming for small farmers?

Answer “no”: End project

Answer “yes”: Consider next question…

• If so, is it worth training more such agricultural extension agents?


An ancient example

• Group exercise:

What lessons about making evaluation useful do you extract from this example?


Goal of U-FE

Intended Use by Intended Users


Intended Evaluation Users

From…

Audiences to…

Stakeholders to…

Primary Intended Users

Connotative differences?

Personal Factor

Critical success factors:

There are five key variables that are absolutely critical in evaluation use. They are, in order of importance:

• People

– People

• People

– People

» PEOPLE

Identify and Involve Primary Intended Users


Intended Use by Intended Users

-----------------

Facilitating Intended Use Options

Different Evaluation Purposes

• For making judgments

Commonly called summative evaluations

• For improving programs

Commonly called formative evaluations

• For ongoing development

Sometimes called developmental evaluations


Lessons Learned Purpose

• Knowledge building

Meta-evaluation, lessons learned, effective practices


Additional purpose distinctions

• Accountability

• Monitoring (M & E)


Tensions

Different intended uses serve different purposes and, typically, different intended users.

Thus the need to FOCUS and manage tensions between and among different purposes.


Balancing Different Purposes


Premises of Utilization-Focused Evaluation

• No evaluation should go forward unless and until there are primary intended users who will use the information that can be produced.

• Primary intended users are involved in the process.

• Evaluation is part of initial program design – the primary intended users want information to help answer a question or questions.

• Evaluator’s role is to help intended users clarify their purpose and objectives.

• Make implications for use part of every decision throughout the evaluation – the driving force of the evaluation process.


Important trend

• Capacity-building:

Evaluation capacity-building as a priority to support use


Basic evaluation literacy

• Know the evaluation standards

• Know how to apply the standards in the actual conduct of evaluations

• Understand different potential uses and their implications, methodologically and procedurally

• Understand how to identify and work with primary intended users

• Have evaluators with essential skills


Process Use

Process use refers to and is indicated by individual changes in thinking and behavior, and program or organizational changes in procedures and culture, that occur among those involved in evaluation as a result of the learning that occurs during the evaluation process. Evidence of process use is represented by the following kind of statement after an evaluation: “The impact on our program came not so much from the findings but from going through the thinking process that the evaluation required.”


Process Uses

• Enhancing shared understandings

• Focusing programs: What gets measured gets done

• Supporting and reinforcing the program intervention, e.g., feedback for learning

• Capacity-building for those involved, deepening evaluative thinking

• Program and organizational development, e.g., evaluability assessments


New Direction

Infusing evaluative thinking as a primary type of process use.

Capacity-building as an evaluation focus of process use.


Some premises:

• Evaluation is part of initial program design, including conceptualizing the theory of change.

• Evaluator’s role is to help users clarify their purpose, hoped-for results, and change model.

• Evaluators can/should offer conceptual and methodological options.

• Evaluators can help by questioning assumptions.

• Evaluators can play a key role in facilitating evaluative thinking all along the way.

Utilization-Focused Methods Decisions



Involving primary intended users in methods decisions

• The myth that methods decisions are primarily technical

• Balancing utility and accuracy

• Attending to situational and purpose-based credibility


Examples of methods options

• Data collection options

• Odd-even questionnaire items

• Sampling options

• Definitional issues

• Dosage issues

• Cohort designs


The Challenge:

Matching the evaluation design to the evaluation’s purpose, resources, and timeline to optimize use.


Shaping of an issue over time

The morphing of the paradigms debate (qualitative vs quantitative) into the Gold Standard debate (randomized control trials as the alleged “gold standard” for impact evaluation)


GOLD STANDARD:

METHODOLOGICAL APPROPRIATENESS

not

Methodological orthodoxy or rigidity


• No single design or method is universally “strongest”

• Multiple ways of establishing causality

• Dangerous to privilege one method, creates perverse incentives


The Janus Challenge

In Roman mythology, Janus is the god of gates, doors, beginnings, endings, and time. The month of January is named in his honor. He is depicted classically with two faces looking in opposite directions. One looks back at the past year while the other looks forward to the new, simultaneously peering into the future and the past.


The challenge I faced in writing this book was not in looking simultaneously backwards and forward, but rather in portraying utilization-focused evaluation as a series of sequential steps while also capturing the complex nature of the utilization-focused process as non-linear, interactive and dynamic.

The 17 U-FE checklist steps, depicted as a non-linear, interactive cycle rather than a simple sequence:

• 1 & 2. Assess organizational and evaluator readiness

• 3. Engage primary intended users (ongoing)

• 4. Situation analysis (ongoing)

• 5. Focus intended findings uses

• 6. Focus intended process uses

• 7. Prioritize evaluation questions

• 8. Check that fundamental issues are sufficiently addressed: goal attainment, implementation, comparisons, attribution

• 9. Theory of change work

• 10. Negotiate methods

• 11. Methods debates

• 12. Simulate use of findings

• 13. Gather data with ongoing attention to use

• 14. Data presented for user engagement

• 15. Report produced

• 16. Follow up with users to facilitate use

• 17. Utilization-Focused Metaevaluation

Throughout: ongoing attention to intended uses by intended users; use findings.

The Challenge:

Matching the evaluation design to the evaluation’s purpose, resources, and timeline to optimize use.


Leaders as Primary Intended Users

Reality-Testing, Results-Oriented, Learning-Focused Leadership


Evaluation Leadership Functions

1. Create and nurture a results-oriented, reality-testing culture.

2. Lead in deciding what outcomes to commit to and hold yourselves accountable for.

3. Make measurement of outcomes thoughtful, meaningful and credible.

4. Use the results – and model for others serious use of results.

Summary Lessons on Useful Evaluation

• Clearly identify primary intended users

• Clearly identify primary intended uses

Goal: Intended use by intended users

• Negotiate FOCUS – get agreement on criteria

• Establish a clear ACTION framework

• Distinguish empirical questions from value questions

• Select methods appropriate to the question

• Facilitate actual use of the findings


References

Patton, Michael Quinn. Utilization-Focused Evaluation, 4th ed. Sage Publications, 2008.

Patton, Michael Quinn. Essentials of Utilization-Focused Evaluation. Sage, 2012.