Program Evaluation: How Do I Show This Works? Paul F. Cook, PhD, UCDHSC School of Nursing

Page 1:

Program Evaluation

How Do I Show This Works?

Paul F. Cook, PhD, UCDHSC School of Nursing

Page 2:

Why Evaluate?

• We have to (state, federal, or contract regulations)

• In order to compete (JCAHO, NCQA, URAC)
• It helps us manage staff
• It helps us manage programs
• It helps us maintain high-quality programs
• It helps us develop even better programs

Page 3:

Targets for Evaluation

• Services (ongoing quality of services delivered)

• Systems (service settings or workflows)
• Programs (special projects or initiatives)
– “horse races” to decide how to use limited resources
– cost/benefit analysis may be included
• People (quality of services by individuals)
– provider or site “report cards”
– clinical practice guideline audits
– supervisor evaluation of individuals

Grembowski. (2001). The Practice of Health Program Evaluation

Page 4:

Basic: One-Time Evaluation

• Special projects
• Grants
• Pilot programs

May have a control group or use pre-post design

Opportunity for better-designed research

Finish it and you’re done

Page 5:

Intermediate: Quality Improvement

• Measure (access, best practices, patient satisfaction, provider satisfaction, clinical outcomes)
• Identify Barriers
• Make Improvements
• Re-Measure
• Etc.

[Cycle diagram: Measure → Identify Barriers → Improve → repeat]

Page 6:

Advanced: “Management by Data”

• Data “dashboards”
• Real-time monitoring of important indicators
• Requires automatic data capture (no manual entry) and reporting software – sophisticated IT

If you want to try this at home:
– SQL database (or start small with MS Access)
– Crystal Reports report templates
– Crystal Enterprise software to automate reporting
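The idea of a dashboard indicator computed straight from captured data can be sketched in miniature. This is only an illustration: the table and column names are hypothetical, and Python's built-in sqlite3 stands in for the SQL database and reporting stack named above.

```python
import sqlite3

# Hypothetical encounter table standing in for an automatically
# captured clinical data feed (sqlite3 used here purely for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE encounters (patient_id TEXT, screened INTEGER)")
conn.executemany(
    "INSERT INTO encounters VALUES (?, ?)",
    [("p1", 1), ("p2", 0), ("p3", 1), ("p4", 1)],
)

# One dashboard indicator: percent of encounters with screening documented.
(rate,) = conn.execute(
    "SELECT 100.0 * SUM(screened) / COUNT(*) FROM encounters"
).fetchone()
print(f"Screening rate: {rate:.0f}%")
```

In a real deployment the query would run on a schedule against the live database, with the reporting software refreshing the display.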

Page 7:

Worksheet for a QI Project

• Name (action word: e.g., “Improving x …”)
• Needs assessment
– Target population
– Identified need
• Performance measures
– Baseline data
– Timeframe for remeasurement
• Benchmark and/or goal
• Barriers and opportunities
• Strong and targeted actions
• Remeasurement and next steps

Page 8:

Planning for an Evaluation

Page 9:

“You can accomplish anything in life, provided that you do not mind who gets the credit.”

—Harry S. Truman

Page 10:

Start with Stakeholders

• Even if you know the right problem to fix, someone needs to buy in – who are they?
– Coworkers
– Management
– Administration
– Consumer advocates
– Community organizations (CBPR)
– The healthcare marketplace
• Strategy: sell your idea at several levels
• To succeed: focus on each group’s needs

Fisher, Ury, & Patton. (2003). Getting to Yes

Page 11:

Needs Assessment (Formative Evaluation)

• Use data, if you have them
• Describe current environment, current needs or goals, past efforts & results
• Various methods:
– Administrative services data
– Administrative cost data
– Administrative clinical data (e.g., EMR)
– Chart review data (a small sample is OK)
– Survey data (a small sample is OK)
– Epidemiology data or published literature

Page 12:

Common Rationales for a QIP

• High Risk

• High Cost

• High Volume

• Need for Prevention

Page 13:

Program Design

• Theoretical basis for the program
• Resources needed
– Time
– People
– Money/equipment
– Space
• Concrete steps for implementing the program
– Manuals
– Software
– Tools/supplies
– Training
– Ongoing supervision

Page 14:

How Implementation May Fail

• Lack of fidelity to theory
• Providers not adequately trained
• Treatment not implemented as designed
• Participants didn’t participate
• Participants didn’t receive “active ingredients”
• Participants didn’t enact new skills
• Results didn’t generalize across time and situations

Bellg, et al. (2004). Health Psychology, 23(5), 443-451

Page 15:

Selling Innovations

It may help to emphasize:
• Relative advantage of the change
• Compatibility with the current system
• Simplicity of the change and of the transition
• Testability of the results
• Observability of the improvement

Rogers. (1995). The Diffusion of Innovations.

Page 16:

Performance Measures and Baseline Data

Page 17:

“The Commanding General is well aware that the forecasts are no good. However, he needs them for planning purposes.”

— Nobel laureate economist Kenneth Arrow, quoting from his time as an Air Force weather forecaster

Page 18:

Asking the Question

• Ask the right question
– Innovation
– Process (many levels)
– Outcome
– Impact
– Capacity-Building/Sustainability
• Try to answer only one question
– Focus on the data you must have at the end
– Consider other stakeholders’ interests
– Collect data on side issues as you can
– Attend to “respondent burden” concerns

Page 19:

Standard Issues in Measurement

• Reliability (data are free of random error)
• Validity (measuring the right construct)
• Responsiveness (ability to detect change)
• Acceptability (usefulness in practice)

Existing measures may not work in all settings; unvalidated measures may not tell you anything.

Page 20:

Data Sources

[Diagram: nine data sources feeding “Answers”: patient satisfaction surveys, encounter data, chart review data, provider surveys, financial data, qualitative/focus group data, patient outcome surveys, interview data, and safety monitoring data]

Page 21:

No Perfect Method

• Patient survey
– Recall bias
– Social desirability bias
– Response bias
• Observer ratings
– Inter-rater variability
– Availability heuristic (“clinician’s illusion”)
– Fundamental attribution error
• Administrative data
– Collected for other purposes
– Gaps in coverage, care, eligibility, etc.

Goal: measures that are “objective, quantifiable, and based on current scientific knowledge”

Page 22:

CNR “Instrument Lab” at www.uchsc.edu/nursing/cnr

Page 23:

Benchmarking

• Industry standards (e.g., JCAHO)

• Peer organizations

• “Normative data” for specific instruments

• Research literature– Systematic reviews– Meta-analyses– Individual articles in peer-reviewed journals

Page 24:

Setting Goals

• An average or a percent is OK

• Set an absolute goal: “improve to 50%” vs. “improve by 10% over baseline”
– “improve by x percentage points”, not
– “improve by x percent” (this depends on base rate)

• Set an achievable goal: if you’re at 40%, don’t make it 90%

• Set a ‘stretch goal’ - not too easy

• ‘Zero performance defects’ (100%) is rarely helpful – use 95% or 99% performance instead
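The percentage-points-versus-percent distinction is easy to see with a toy calculation (the baseline value here is hypothetical):

```python
# Hypothetical baseline: 40% of audited charts meet the guideline.
baseline = 0.40

# "Improve by 10 percentage points": an absolute increase.
goal_points = baseline + 0.10    # a 50% target

# "Improve by 10 percent": relative to the base rate.
goal_percent = baseline * 1.10   # only a 44% target

print(f"{goal_points:.0%} vs. {goal_percent:.0%}")
```

The two goals diverge more the further the baseline is from 100%, which is why stating the goal in percentage points is the safer convention.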

Page 25:

Special Issues in Setting Goals

• Absolute number may work as a goal in some scenarios – e.g., # of rural health consults/yr – but percents or averages allow statistical significance testing

• Improvement from zero to something is usually not seen as improvement (“so your program is new; but what good did it do?”)

• Don’t convert a scale to a percent (i.e., don’t use a cut-off point) unless you absolutely must

Owen & Froman. (2005). Research in Nursing & Health, 28, 496-503

Page 26:

Specifying the “Denominator”

• Level of analysis
– Unit
– Provider
– Patient
– For some populations, family
• A subgroup, or the entire population?
– Primary (entire population)
– Secondary (at-risk population)
– Tertiary (identified patients)
– Subgroups of patients (e.g., CHD with complications)

Page 27:

Baseline Data

• May be pre-existing
– Charts
– Administrative data
• May need to collect data prior to starting
– Surveys
• Whatever you do for baseline, you will need to do it exactly the same way for remeasurement
– Same type of informant (e.g., providers)
– Same instruments (e.g., chart audit tool)
– Same method of administration (e.g., by phone)

Page 28:

Sampling

• Representativeness
– Random sample, stratified sample, quota sample
– Characteristics of volunteers
– Underrepresented groups
– Effect of survey method (phone, Internet)
• Response Rate
– 10% is good for industry
– If you have 100% of the data available, use 100%

• Finding the right sample size: http://www.surveysystem.com/sscalc.htm
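Calculators like the one linked above implement the standard formula for estimating a proportion, with a finite-population correction. The sketch below is a reconstruction of that formula, not the site's own code:

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Sample size for estimating a proportion with a finite-population
    correction. z = 1.96 gives 95% confidence; p = 0.5 is the most
    conservative assumption about the true proportion."""
    n0 = z ** 2 * p * (1 - p) / margin ** 2             # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # correct for finite N

# E.g., auditing charts drawn from a population of 1,000 patients
# with a +/-5% margin of error:
print(sample_size(1000))
```

Note that the required sample shrinks as the population shrinks, which is why small clinics can often get away with auditing far fewer charts than a health plan can.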

Page 29:

Evaluation Frequency and Duration

• Seasonal trends (selection bias)

• Confounding factors– Organizational change (history)– Outside events (history)– Other changes in the organization (maturation)– Change in patient case mix (selection bias)

• For the same subjects over time (pre/post):– Notice the shrinking denominator (attrition)

• If your subjects know they are being evaluated:– Don’t evaluate too often (testing)– Don’t evaluate too rarely (reactivity)

Page 30:

Evaluation Design

Page 31:

Snow White and the 7 Threats to Validity

• History – external events
• Maturation – mere passage of time
• Testing – observation changes the results
• Instrumentation – random noise on the radar
• Mortality/Attrition – data lost to follow-up
• Selection Bias – not a representative sample
• Reactivity – placebo (Hawthorne) effects

Grace. (1996). http://www.son.rochester.edu/son/research/research-fables.

Page 32:

Study Designs: Effect of Time × Effect of Group

• No Comparison Group
– 9: Posttest only – posttest project results
– 8: Pretest and posttest – pretest-posttest change
– 7: Pretest, posttest, and follow-up – longitudinal project results
• Nonrandom Comparison Group
– 6: Posttest only – posttest results vs. control
– 5: Pretest and posttest – pre-post change vs. control
– 4: Pretest, posttest, and follow-up – longitudinal results vs. control
• Randomized Control Group
– 3: Posttest only – pilot RCT
– 2: Pretest and posttest – full RCT
– 1: Pretest, posttest, and follow-up – longitudinal RCT

Adapted from Bamberger et al. (2006). RealWorld Evaluation

Page 33:

Post Hoc Evaluation

• Can you get posttest for the intervention?– Design 9

• Can you get baseline for the intervention?– Design 8 or 7

• Can you get posttest for a comparison group?– Design 6

• Can you get baseline for the comparison group?– Design 5 or 4– (Randomization requires a prospective design)

Page 34:

Cost-Effectiveness Evaluation

• From “does it work?” to “can we afford it?”
• Methods for cost-effectiveness evaluation:
– Cost-offset: does it save more than it spends?
– Cost-benefit: do the benefits produced (measured in $ terms – e.g., QALYs valued at ~$50K each) exceed the $ costs?
– Cost-effectiveness: do the health benefits produced (measured as clinical outcomes – e.g., reduced risk based on odds ratios) justify the $ costs?
– Cost-utility: do the health benefits produced (measured based on consumer preferences – e.g., willingness to pay) justify the $ costs?

Kaplan & Groessl. (2002). J Consult Clin Psych, 70(3), 482-493
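A cost-per-QALY comparison against the ~$50K threshold mentioned above reduces to simple arithmetic. All the program numbers below are hypothetical, purely for illustration:

```python
# Hypothetical program figures, for illustration only.
program_cost = 120_000   # dollars spent delivering the program
qalys_gained = 3.0       # quality-adjusted life-years it produced
threshold = 50_000       # the ~$50K-per-QALY benchmark from the slide

cost_per_qaly = program_cost / qalys_gained
verdict = "below" if cost_per_qaly <= threshold else "above"
print(f"${cost_per_qaly:,.0f} per QALY ({verdict} the threshold)")
```

The hard part in practice is not this division but measuring the cost and QALY inputs credibly.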

Page 35:

Statistics

• Descriptive statistics: “how big?”
– Averages & Standard Deviations
– Correlations
– Odds Ratios
• Inferential statistics: “how likely?”
– Various tests (t, F, chi-square)
– Correct test to use depends on the type of data
– All give you a p-value (chance the result is random)
– “Significant” or not is highly dependent on sample N
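The dependence of “significant or not” on sample N can be demonstrated directly: the same 50%-versus-60% difference is non-significant with 50 subjects per group but clearly significant with 500. This hand-rolled two-proportion z-test (with hypothetical counts) is only a sketch, not a substitute for a statistics package:

```python
import math

def two_proportion_p(x1, n1, x2, n2):
    """Two-sided p-value for a two-proportion z-test (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - normal CDF at |z|)

# Same 50% vs. 60% improvement, two different sample sizes:
p_small = two_proportion_p(25, 50, 30, 50)      # 50 per group
p_large = two_proportion_p(250, 500, 300, 500)  # 500 per group
print(f"n=50 per group:  p = {p_small:.3f}")
print(f"n=500 per group: p = {p_large:.4f}")
```

This is why a non-significant result in a small QI sample does not prove the program failed, and why very large administrative samples can make trivial differences “significant.”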

Page 36:

Actions for Improvement

Page 37:

Evaluating Actions

• Strong
– Designed to address the barriers identified
– Consistent with past experience/research literature
– Seem likely to have an impact
– Implemented effectively & consistently
• Targeted
– Right time
– Right place
– Right people
– Have an impact on the barriers identified

NCQA. (2003). Standards and Guidelines for the Accreditation of MBHOs.

Page 38:

Theory-Based Actions

• What is the problem? (descriptive)
• What causes the problem? (problem theory)
– People
– Processes
– Unmet needs
• How to solve the problem? (theory of change)
– Educate
– Coach or Train
– Communicate or build linkages
– Redesign existing systems or services
– Design new systems or services
– Use new technologies

Page 39:

Using Evaluation Results

Page 40:

Describe the Process

• Needs analysis and stakeholder input
• Identification of barriers
• Theory basis for the intervention
– What actions were considered?
– Why were these actions chosen?
– How did these actions address the identified barriers?
• Implementation
– What was done?
– Who did it?
– How were they monitored, supervised, etc.?
– For how long, in what amount, in what way was it done?
• Data collection
– What measures were used?
– How were the data collected, and by whom?

Page 41:

Describe the Results (Summative Evaluation)

• What were the outcomes?
– Data on the primary outcome measure
– Compare to baseline (if available)
– Compare to goal
– Compare to benchmark
– Provide data on any secondary measures that also support your conclusions about program outcomes
• What else did you find out?
– Answers to any additional questions that came up
– Any other interesting findings (lessons learned)

• Show a graph of the results

Page 42:

“Getting information from a table is like extracting sunlight from a cucumber.”

—Wainer & Thissen, 1981

Page 43:

Conclusions

• If the goals were met:
– What key barriers were targeted?
– What was the most effective action, and why?
• If the goals were not met:
– Did you miss some key barriers to improvement?
– Was the idea good, but there were barriers to implementation that you didn’t anticipate? What were they, and how could they be overcome?
– Did you get only part way there (e.g., change in knowledge but not change in behavior)?
– Did the intervention produce results on other important outcomes instead?

Page 44:

Dissemination

• Back to the original stakeholder groups
• Remind them – needs, goals, and actions
• Address additional questions or concerns
– “That’s a good suggestion; we could try it going forward, and see whether it helps”
– “We did try that, and here’s what we found”
– “We didn’t have time/money/experience to do that, but we can explore it for the future”
– “We didn’t think of that question, but we do have some data that might answer it”
– “We don’t have data to answer that question, but it’s a good idea for future study”

Page 45:

Broader Dissemination

• Organizational newsletter
• Summaries for patients, providers, payors
• Trade association conference or publication
• Scholarly research conference presentation
– Rocky Mountain EBP Conference
– WIN Conference
• Scholarly journal article
– Where to publish depends on rigor of the design
– Look at journal “impact factor” (higher = broader reach, but also more selective)

• Popular press

Page 46:

Next Steps

• PDSA model: after “plan-do-study,” the final step is “act” – roll out the program as widely as possible to obtain all possible benefits

• Use lessons learned in this project as the needs analysis for your next improvement activity

• Apply what you’ve learned about success in this area to design interventions in other areas

• Set higher goals, and design additional actions to address the same problem even better