
&%$#@&%$!! Evaluation is NOT a Dirty Word

Kathleen Dowell, Ph.D.

EvalSolutions

Epilepsy Foundation: Best Practices Institute
September 29, 2012

Denver, Colorado

Too expensive

Too complicated

Too time consuming

Not a priority

Just don’t know where to start

Barriers

Lack of research/statistics skills
Lack of time
Lack of resources
Other priorities
Lack of incentive
Fear
Don’t see value

What is Evaluation?

The process of determining the merit, worth, or value of a program (Scriven, 1991)

What is Evaluation?

Systematic inquiry that describes and explains policies’ and programs’ operations, effects, justifications, and social implications (Mark, Henry, & Julnes, 2000)

What is Evaluation?

The systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of social intervention programs (Rossi & Freeman, 1989)

In simpler terms…

Collection of information to determine the value of a program

eVALUation

Evaluation is NOT….

Auditing
Personnel assessment
Monitoring (although this can be part of an evaluation process)
Used to end or shut down programs

Evaluation Myth #1

Evaluation is an extraneous activity that generates lots of boring data with useless conclusions

Evaluation Myth #2

Evaluation is about proving the success or failure of a program

Evaluation Myth #3

Evaluation is a unique and complex process that occurs at a certain time in a certain way, and almost always includes the use of outside experts.

How Can Evaluation Help You?

Demonstrate program effectiveness or impacts
Better manage limited resources
Document program accomplishments
Justify current program funding
Support need for increased funding
Satisfy ethical responsibility to clients to demonstrate positive and negative effects of participation
Document program development and activities to help ensure successful replication

Ultimately…

To improve program performance, which leads to better value for your resources

No Evaluation Means…

No evidence that your program is working or how it works
Lack of justification for new or increased funding
No marketing power for potential clients
Lack of credibility
Lack of political and/or social support
No way to know how to improve

Program Life Cycle

Development

Implementation

Evaluation

Revision
Sustainability

Basic Terminology

Types of Evaluation
› Outcome (summative)
› Process (formative)

Outcomes
Indicators
Measures
Benchmarks
Quantitative vs. qualitative

Evaluation Process

Engage stakeholders

Clearly define program

Written evaluation plan

Collect credible/useful data

Analyze data

Share/use results

Engage Stakeholders

Those involved in program design, delivery, and/or funding
Those served by the program
Users of the evaluation results

Clearly Define Program

Resources, activities, outcomes
Context in which program operates
Logic model
› Explicit connections between “how” and “what”
› Helps with program improvement
› Good for sharing program idea with others
› Living, breathing model

IF … THEN

IF I take an aspirin, THEN my headache will go away.

IF = Inputs & Activities
THEN = Outcomes

Written Evaluation Plan

Outcomes
Indicators
Tools
Timelines
Person(s) responsible (optional)

Sample Evaluation Plan

PROGRAM OUTCOME: Training participants know how to recognize a seizure

INDICATOR(S): Percent of training participants who correctly identify 10 out of 13 possible symptoms of a seizure

DATA COLLECTION TOOL: Participant pre, post, and follow-up surveys

DATA COLLECTION SCHEDULE: Pre survey given prior to training; post survey given immediately after training; follow-up survey given 30 days after training
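As a quick illustration, the indicator in the sample plan can be tabulated directly from scored surveys. This is a minimal sketch: the function name and the participant scores are invented, and only the "10 out of 13 symptoms" threshold comes from the plan above.

```python
# Hypothetical tabulation of the sample indicator: percent of participants
# who correctly identify at least 10 of 13 possible seizure symptoms.
def percent_meeting_indicator(scores, threshold=10):
    """Share (0-100) of participants whose correct-answer count meets the threshold."""
    if not scores:
        return 0.0
    meeting = sum(1 for s in scores if s >= threshold)
    return 100.0 * meeting / len(scores)

# Invented post-survey scores: correct symptoms identified, out of 13.
post_survey_scores = [12, 9, 11, 13, 8, 10, 12]
print(f"{percent_meeting_indicator(post_survey_scores):.0f}% met the indicator")  # → 71%
```

The same function works for the pre- and follow-up surveys, so the indicator can be tracked across the whole data collection schedule.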

Credible Data Collection Tools

Valid and reliable tools
› Valid = measures what it is intended to measure
› Reliable = consistent results over time

Qualitative
Quantitative
Will answer your evaluation questions and inform decision-making

Collect Credible/Useful Data

Quantitative
› Surveys
› Tests
› Skill assessments

Qualitative
› Focus groups
› Interviews
› Journals
› Observations

Analyze Data

Many methods
Answer evaluation questions
Engage stakeholders in interpretations
Justify conclusions and recommendations
Get help if needed!

Share/Use Results

Reporting format
Getting results into the right hands
Framing the results
Collaborative vs. confrontational approach
Keeping users “in the loop”
Debriefs and follow-up

Considerations

Purpose
Audience
Resources
Data
Timeline
Planning is key
Expertise

Contracting out…

Staff to perform work
› Expertise
› Available

Credibility

Technological support
› Collect data
› Analyze data

Time frame

Seniors and Seizures Evaluation

Training program for caretakers of seniors with epilepsy/seizures

ADC staff and primary care providers
Training provided by affiliates
Delivery varies but content is consistent

Process

Meeting with EF staff to learn about the program

Collaboration with affiliate staff to design logic model

Decisions regarding which outcomes to measure

Decisions regarding how to best collect data

Designed data collection tools
Pilot testing and revision

Evaluation Questions

What impact did the training program have on knowledge of seizures in seniors?
› Pre and post knowledge assessment
› Post-training survey

What impact did the training program have on participants’ confidence and comfort in working with seniors?
› Post-training survey

Knowledge

[Chart: Increased Knowledge After “Seniors and Seizures” Training Program (N=17). Percentage of test questions answered correctly: 54% pre-training, 88% post-training.]

Care of Seniors

[Chart: Participant Ratings of Training Impact, on a 1–10 scale (1 = no impact, 10 = extensive impact; for confidence, 1 = not at all confident, 10 = completely confident): Confidence to use material 8.0, Quality of care 7.9.]

Our benchmark is a rating of 7.0 or higher
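The benchmark comparison described above can be sketched in a few lines. The 7.0 benchmark and the 8.0/7.9 ratings come from the slides; the variable names and dictionary layout are invented for illustration.

```python
# Compare participant ratings against the program's stated benchmark of 7.0.
BENCHMARK = 7.0

# Ratings reported on the slide; the data structure itself is hypothetical.
ratings = {"Confidence to use material": 8.0, "Quality of care": 7.9}

# True where a rating meets or exceeds the benchmark.
results = {item: rating >= BENCHMARK for item, rating in ratings.items()}
for item, met in results.items():
    print(f"{item}: {'meets' if met else 'below'} the {BENCHMARK} benchmark")
```

Here both ratings clear the benchmark, which is the evidence of training impact the evaluation was designed to surface.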

For More Information….

Kathleen Dowell, Ph.D., President
EvalSolutions

6408 Whistling Wind Way
Mt. Airy, MD 21771

410-707-0763
kathy@eval-solutions.com
www.eval-solutions.com