Transcript of Evaluation 101, Laura Pejsa, Goff Pejsa & Associates, MESI 2014.

Page 1:

Evaluation 101

Laura Pejsa

Goff Pejsa & Associates

MESI 2014

Page 2:

Objectives

• Gain a greater understanding of evaluation and evaluative thinking

• Learn about some practical approaches & get familiar with some tools to use

• Have an opportunity to apply your learning directly to a real-world case

Page 3:

Session Outline

• Introductions / Intro to the day

• Grounding definitions & terms

• Understanding “programs” (purpose & logic)

• Evaluative thinking and the evaluation process

• Strategies for making evaluation desirable & usable

• Debrief, questions, & close

Page 4:

Metaphors: Your Ideas about Evaluation

• Think of one object that represents your ideas and/or feelings about evaluation

• Prepare to explain your choice

• Share your object with the person sitting next to you and notice common themes

• Prepare to share your common themes with the group.

Page 5:


E-VALU-ation

"Value" is the root word of evaluation

Evaluation involves making value judgments, according to many in the field

Page 6:

Traditional definition: Michael Scriven

(from Michael Scriven, 1967, and the Program Evaluation Standards)

“The systematic determination of the merit, worth (or value) of an object”

Page 7:

Important concepts in this definition

• SYSTEMATIC means that evaluators use explicit rules and procedures to make determinations

• MERIT is the absolute or intrinsic value of an object

• WORTH is the relative or extrinsic value of an object in a given context

Page 8:

An Alternative Definition: Michael Quinn Patton

Systematic collection of information about the activities, characteristics, and results of programs to (1) make judgments about the program, (2) improve or further develop program effectiveness, (3) inform decisions, and/or (4) increase understanding.

Done for and with specific intended primary users for specific, intended uses.

Page 9:

Commonalities among definitions

• Evaluation is a systematic process

• Evaluation involves collecting data

• Evaluation is a process for enhancing knowledge and decision making

• Evaluation use is implicit or explicit

Russ-Eft & Preskill (2009, p. 4)

Page 10:

Discussion: Why Do Evaluation?

• What might we gain from engaging in evaluation or an evaluative process?

• Why is it in our interest to do it?

• Why is it in the interest of the people we serve to do it?

• What are the benefits?

Page 11:

From the textbooks… evaluation purposes

• Accreditation
• Accountability
• Goal attainment
• Consumer protection
• Needs assessment
• Object improvement
• Understanding or support
• Social change
• Decision making

Page 12:

One basic distinction… Internal vs. External


INTERNAL evaluation

Conducted by program employees

Plus side: Knowledge of program

Minus side: Potential bias and influence

Page 13:

EXTERNAL evaluation

• Conducted by outsiders, often for a fee

• Plus side: Less visible bias

• Minus side: Outsiders have to gain entrée and have less first-hand knowledge of the program

Page 14:

Scriven's classic terms

FORMATIVE evaluation

• Conducted during the development or delivery of a program

• Feedback for program improvement

Page 15:

Scriven's classic terms

SUMMATIVE evaluation

• Typically done at the end of a project or project period

• Often done for other users or for accountability purposes

Page 16:

A new(er) term from Patton

DEVELOPMENTAL evaluation

• Helps develop a program or intervention

• Evaluators are part of the program design team

• Uses systematically collected data

Page 17:

What is the evaluation process?

Every evaluation shares similar procedures


Page 18:

Patton’s Basics of Evaluation:

• What?
• So what?
• Now what?

Page 19:

General Phases of Evaluation Planning

Phase I: Object description
  What are we evaluating?

Phase II: Context analysis
  1. Why are we doing an evaluation?
  2. What do we hope to learn?

Phase III: Evaluation plan
  How will we conduct the study?

Page 20:

What?

• Words?

• Pictures?

The key is understanding…

Page 21:

“We build the road, and the road builds us.” -Sri Lankan saying


A word about logic models and theories of change…

one way to understand a program.

Page 22:

Simplest form of a logic model

INPUTS → OUTPUTS → OUTCOMES

Results-oriented planning

Page 23:

A bit more detail…

INPUTS: Program investments (what we invest)

OUTPUTS: Activities (what we do); Participation (who we reach)

OUTCOMES: Short-, medium-, and long-term (what results?)

SO WHAT? What is the VALUE?
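To make the inputs → outputs → outcomes structure concrete, here is a minimal sketch (not from the slides) that captures a hypothetical tutoring program's logic model as a plain data structure; every program detail in it is an illustrative assumption.

```python
# Hypothetical tutoring-program logic model as a plain data structure.
# All program details below are illustrative assumptions, not from the slides.
logic_model = {
    "inputs": ["volunteer tutors", "curriculum materials", "grant funding"],
    "outputs": {
        "activities": ["weekly one-on-one tutoring sessions"],
        "participation": ["40 students per semester"],
    },
    "outcomes": {
        "short_term": ["improved homework completion"],
        "medium_term": ["higher reading scores"],
        "long_term": ["on-time grade promotion"],
    },
}

# The "if-then" logic reads left to right: if we invest the inputs and
# deliver the outputs, then we expect the outcomes to follow.
for stage, contents in logic_model.items():
    print(stage.upper(), "->", contents)
```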

Page 24:

A simplistic example…

[Slide shows a simple example logic model: inputs, outputs, and short-term outcomes]

Page 25:

Page 26:

What does a logic model look like?


Page 27:

Regardless of format, what do logic models and theories of change have in common?

• They show activities linked to outcomes

• They show relationships/connections that make sense (are logical); arrows show the “if-then” connections

• They are (hopefully) understandable

• They do not and cannot explain everything about a program!

Page 28:

The Case

Page 29:

The Case: Logic and/or Theory

Draw a picture…

• Inputs (what goes into the program to make it possible?)

• Outputs (Activities: what do they do? Participation: counts)

• Outcomes (what do they think will happen?)
  • Short, medium, and long term

Page 30:

What can we evaluate?

• Context
• Input(s)
• Process(es)
• Product(s)

Daniel Stufflebeam

Page 31:

The basic inquiry tasks (BIT)

1. Framing questions

2. Determining an appropriate design

3. Identifying a sample

4. Collecting data

5. Analyzing data and presenting results

6. Interpreting results

7. “Reporting”

Page 32:

Back to the Case: What are our questions?

Evaluation Question:
#1
#2
#3

Page 33:

Back to the Case: What do we need to know, and where can we find it?

Evaluation Question | Information Needed | Information Source
#1                  |                    |
#2                  |                    |
#3                  |                    |

Page 34:

Possible ways to collect data

Quantitative:
o Surveys
o Participant assessments
o Cost-benefit analysis
o Statistical analysis of existing program data
o Some kinds of record and document review

Qualitative:
o Focus groups
o Interviews
o Observations
o Appreciative inquiry
o Some kinds of record and document review
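As one concrete illustration of the quantitative options above, here is a minimal sketch (not from the slides) of a statistical analysis of existing program data: comparing participants' pre- and post-program survey scores, assuming pandas and SciPy are available. The file name and column names are hypothetical.

```python
# A minimal, hypothetical sketch: pre/post comparison of program survey scores.
# File and column names are illustrative assumptions, not from the slides.
import pandas as pd
from scipy import stats

df = pd.read_csv("participant_scores.csv")  # hypothetical program data export
pre, post = df["pre_score"], df["post_score"]

print(f"Mean pre-program score:  {pre.mean():.2f}")
print(f"Mean post-program score: {post.mean():.2f}")
print(f"Mean change:             {(post - pre).mean():.2f}")

# Paired t-test: is the change larger than chance variation would suggest?
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```

A paired test fits here because the same participants are measured twice; an unpaired comparison would ignore that structure.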

Page 35:

What are the best methods for your evaluation?

It all goes back to your question(s)…

• Some data collection methods are better than others at answering your questions

• Some tools are more appropriate for the audience you need to collect information from or report findings to

• Each method of collecting data has its advantages and disadvantages (e.g., cost, availability of information, expertise required)

Page 36:

Back to the Case: How will we find out?

Evaluation Question | Information Needed | Information Source | Methods
#1                  |                    |                    |
#2                  |                    |                    |
#3                  |                    |                    |

Page 37:

Reminder: Importance of Context

Page 38:

Desire & Use

• How do we make this process palatable, even desirable?

• What can we do to make information USE more likely?

• Ways of sharing and reporting

Page 39:

Debrief & Questions

• What are the most important takeaways from today’s session?

• What can you apply in your own work?

• What questions remain for you?