Evaluation is NOT a Dirty Word
Kathleen Dowell, Ph.D.
EvalSolutions
Epilepsy Foundation: Best Practices Institute
September 29, 2012
Denver, Colorado


Transcript of Evaluation is NOT a Dirty Word

Page 1:

Evaluation is NOT a Dirty Word
Kathleen Dowell, Ph.D.
EvalSolutions
Epilepsy Foundation: Best Practices Institute
September 29, 2012
Denver, Colorado

Pages 2–6:

Too expensive
Too complicated
Too time consuming
Not a priority
Just don’t know where to start

Page 7:

Barriers
› Lack of research/statistics skills
› Lack of time
› Lack of resources
› Other priorities
› Lack of incentive
› Fear
› Don’t see value

Page 8:

What is Evaluation?

The process of determining the merit, worth, or value of a program (Scriven, 1991)

Page 9:

What is Evaluation?

Systematic inquiry that describes and explains policies’ and programs’ operations, effects, justifications, and social implications (Mark, Henry, & Julnes, 2000)

Page 10:

What is Evaluation?

The systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of social intervention programs (Rossi & Freeman, 1989)

Page 11:

In simpler terms…

Collection of information to determine the value of a program

eVALUation

Page 12:

Evaluation is NOT…
› Auditing
› Personnel assessment
› Monitoring (although this can be part of an evaluation process)
› Used to end or shut down programs

Page 13:

Evaluation Myth #1

Evaluation is an extraneous activity that generates lots of boring data with useless conclusions

Page 14:

Evaluation Myth #2

Evaluation is about proving the success or failure of a program

Page 15:

Evaluation Myth #3

Evaluation is a unique and complex process that occurs at a certain time in a certain way, and almost always includes the use of outside experts.

Pages 16–22:

How Can Evaluation Help You?
› Demonstrate program effectiveness or impacts
› Better manage limited resources
› Document program accomplishments
› Justify current program funding
› Support need for increased funding
› Satisfy ethical responsibility to clients to demonstrate positive and negative effects of participation
› Document program development and activities to help ensure successful replication

Page 23:

Ultimately…

To improve program performance, which leads to better value for your resources

Pages 24–29:

No Evaluation Means…
› No evidence that your program is working or how it works
› Lack of justification for new or increased funding
› No marketing power for potential clients
› Lack of credibility
› Lack of political and/or social support
› No way to know how to improve

Page 30:

Program Life Cycle
› Development
› Implementation
› Evaluation
› Revision
› Sustainability

Pages 31–36:

Basic Terminology
› Types of Evaluation
  › Outcome (summative)
  › Process (formative)
› Outcomes
› Indicators
› Measures
› Benchmarks
› Quantitative vs. qualitative

Page 37:

Evaluation Process
› Engage stakeholders
› Clearly define program
› Written evaluation plan
› Collect credible/useful data
› Analyze data
› Share/use results

Page 38:

Engage Stakeholders
› Those involved in program design, delivery, and/or funding
› Those served by the program
› Users of the evaluation results

Page 39:

Clearly Define Program
› Resources, activities, outcomes
› Context in which program operates
› Logic model
  › Explicit connections between “how” and “what”
  › Helps with program improvement
  › Good for sharing program idea with others
  › Living, breathing model

Pages 40–43:

IF I take an aspirin, THEN my headache will go away

IF = Inputs & Activities
THEN = Outcomes

Page 45:

Written Evaluation Plan
› Outcomes
› Indicators
› Tools
› Timelines
› Person(s) responsible (optional)

Page 46:

Sample Evaluation Plan

Program outcome: Training participants know how to recognize a seizure
Indicator(s): Percent of training participants who correctly identify 10 out of 13 possible symptoms of a seizure
Data collection tool: Participant pre, post, and follow-up surveys
Data collection schedule: Pre survey given prior to training; post survey given immediately after training; follow-up survey given 30 days after training
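The indicator in the plan above boils down to a simple proportion over participant responses. A minimal sketch of how that percentage could be computed (all names, scores, and the 13-item symptom checklist here are hypothetical illustrations, not the Foundation's actual instrument):

```python
# Sketch: computing the "percent of participants who correctly identify
# 10 out of 13 possible symptoms" indicator from post-training survey data.
# The threshold, item count, and scores below are hypothetical examples.

THRESHOLD = 10      # minimum symptom items answered correctly to meet the indicator
TOTAL_ITEMS = 13    # symptom items on the survey

# Each entry: number of symptom items one participant identified correctly.
post_survey_scores = [12, 9, 11, 13, 10, 8, 11]

def percent_meeting_indicator(scores, threshold=THRESHOLD):
    """Percent of participants scoring at or above the threshold."""
    meeting = sum(1 for s in scores if s >= threshold)
    return 100.0 * meeting / len(scores)

print(f"{percent_meeting_indicator(post_survey_scores):.1f}% met the indicator")
```

The same calculation would be repeated for the pre, post, and 30-day follow-up surveys named in the data collection schedule, so the indicator can be compared across time points.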

Page 47:

Credible Data Collection Tools
› Valid and reliable tools
  › Valid = measures what it is intended to measure
  › Reliable = consistent results over time
› Qualitative
› Quantitative
› Will answer your evaluation questions and inform decision-making

Page 48:

Collect Credible/Useful Data
› Quantitative
  › Surveys
  › Tests
  › Skill assessments
› Qualitative
  › Focus groups
  › Interviews
  › Journals
  › Observations

Page 49:

Analyze Data
› Many methods
› Answer evaluation questions
› Engage stakeholders in interpretations
› Justify conclusions and recommendations
› Get help if needed!

Page 50:

Share/Use Results
› Reporting format
› Getting results into the right hands
› Framing the results
› Collaborative vs. confrontational approach
› Keeping users “in the loop”
› Debriefs and follow-up

Pages 51–57:

Considerations
› Purpose
› Audience
› Resources
› Data
› Timeline
› Planning is key
› Expertise

Pages 58–65:

Contracting out…
› Staff to perform work
  › Expertise
  › Available
› Credibility
› Technological support
  › Collect data
  › Analyze data
› Time frame

Page 66:

Seniors and Seizures Evaluation
› Training program for caretakers of seniors with epilepsy/seizures
› ADC staff and primary care providers
› Training provided by affiliates
› Delivery varies but content is consistent

Page 67:

Process
› Meeting with EF staff to learn about the program
› Collaboration with affiliate staff to design logic model
› Decisions regarding which outcomes to measure
› Decisions regarding how to best collect data
› Designed data collection tools
› Pilot testing and revision

Page 69:

Evaluation Questions
What impact did the training program have on knowledge of seizures in seniors?
› Pre and post knowledge assessment
› Post-training survey
What impact did the training program have on participants’ confidence and comfort in working with seniors?
› Post-training survey

Page 70:

Knowledge

[Bar chart: Increased Knowledge After “Seniors and Seizures” Training Program (N=17). Percentage of test questions answered correctly rose from 54% pre-training to 88% post-training.]
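The pre/post comparison behind the chart above is an average percent-correct calculation. A minimal sketch, using hypothetical scores and a hypothetical 20-item test (the actual assessment length is not given in the slides):

```python
# Sketch: pre/post percent-correct comparison like the Page 70 chart.
# The item count and all scores below are hypothetical illustrations.

NUM_ITEMS = 20  # questions on the knowledge test (assumed for the example)

pre_scores = [9, 11, 10, 12, 13]    # items correct before training
post_scores = [17, 18, 16, 19, 18]  # items correct after training

def percent_correct(scores, num_items=NUM_ITEMS):
    """Average percent of test questions answered correctly across participants."""
    return 100.0 * sum(scores) / (len(scores) * num_items)

pre = percent_correct(pre_scores)
post = percent_correct(post_scores)
print(f"Pre: {pre:.0f}%  Post: {post:.0f}%  Gain: {post - pre:.0f} points")
```

Note that a pre/post gain like this shows change among participants, not causation on its own; the deck's paired pre, post, and follow-up design is what makes the comparison meaningful.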

Page 71:

Care of Seniors

[Bar chart: Participant Ratings of Training Impact, covering confidence to use material (1 = not at all confident, 10 = completely confident) and quality of care (1 = no impact, 10 = extensive impact), with ratings of 7.9 and 8. Our benchmark is a rating of 7.0 or higher.]

Page 72:

For More Information…
Kathleen Dowell, Ph.D., President
EvalSolutions
6408 Whistling Wind Way
Mt. Airy, MD 21771
[email protected]