Measuring QI Intervention Implementation: Helping the Blind Men See?
EQUIP (Evidence-Based Practice in Schizophrenia)
QUERI National Meeting
Working Group
December 12, 2008
QI Intervention Example
EQUIP (Enhancing QUality of care In Psychosis)
– Evidence-based quality improvement to implement effective care in specialty mental health
– Alex Young, MD & Amy Cohen, PhD (Co-PIs)
EQUIP Effective Schizophrenia Care (model diagram)
• EBQI components: provider/patient education, quality manager, QI informatics support, performance feedback
• Leadership support: "infrastructure," "priority-setting"
• Evidence base: TMAP, EQUIP-1
Context Matters: Design for It
EQUIP
– 4 VISNs: intervention and control site in each VISN
– Sites chosen collaboratively based on interest
– Each VISN asked to select evidence-based care targets for intervention: all selected Wellness & Supported Employment
– Availability & quality of these care targets vary across sites
– Structure of care for patients with schizophrenia varies across sites
– Formative evaluation methods utilized to understand variable implementation
What is Formative Evaluation?
Formative evaluation = an assessment process designed to identify potential and actual influences on the progress and effectiveness of implementation efforts
• Data collection occurs before, during, and after implementation
• Need to be able to answer questions about context, adaptations, and responses to change
Four Stages of Formative Evaluation
• Developmental evaluation
• Implementation-focused evaluation (process evaluation)
• Progress-focused evaluation
• Interpretive evaluation
Stages of FE (STM) & EQUIP FE Measures
FE stages mapped to the Simpson Transfer Model (STM):
Developmental – Pre-Implementation (STM: Exposure & Adoption)
• Field notes
• Documents (minutes, etc.)
• ORC & Burnout Inventory
• Key stakeholder interviews
Implementation-Focused – Implementation (STM: Implementation)
• Field notes
• Quality Coordinator logs
• Documents
• Key stakeholder interviews
Progress-Focused – Implementation (STM: Implementation)
• QI tools
Interpretive – Post-Implementation (STM: Practice)
• Field notes
• Key stakeholder interviews
• ORC & Burnout Inventory
Multiple Data Sources: Measuring Implementation
EQUIP examples:
• Semi-structured interviews (leaders, clinicians, managers) – participation, level of implementation
• Organizational site surveys (administrators & staff) – clinic structure, processes, change
• Field journals – group-level dynamics, implementation details
• Administrative data – visits, Rxs
• Patient surveys – PAS
• Activity logs – time spent on aspects of the study
Multiple Data Sources: Strengths and Challenges
• Semi-structured interviews (leaders, clinicians, managers)
– Strengths: rich data, diverse perspectives
– Challenges: expensive, time-consuming
• Organizational site surveys (administrators & staff)
– Strengths: site profiles, faster, easier to analyze
– Challenges: limited discovery, key informant view
• Field journals
– Strengths: detailed contextual data
– Challenges: variation between observers
• Administrative data
– Strengths: readily available, historical value
– Challenges: not QII-specific, local coding differences
• Patient surveys
– Strengths: validate experience, exposure, outcomes
– Challenges: expensive, highly sensitive to sample
• Activity logs
– Strengths: clinical implementation, dose of effort/time
– Challenges: global measure with no detailed dose information
EQUIP Organizational Climate Measures
• Organizational Readiness for Change (ORC): staff and administrator versions
• Maslach Burnout Inventory
• On-line measure
• Pre- and post-implementation
Organizational Readiness for Change
Using scales related to:
– Motivation for change (program needs, training needs, pressures for change)
– Staff attributes (growth, adaptability)
– Organizational climate (mission, cohesion, autonomy, communication, change)
Purpose is descriptive & to assess change in readiness from pre- to post-implementation
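For illustration, a minimal sketch of how pre/post readiness scores like these might be summarized; the subscale names, items, and 1-5 scoring below are hypothetical stand-ins, not the actual TCU ORC instrument structure:

from statistics import mean

def subscale_means(responses, items):
    # Average the listed items within each respondent, then across respondents.
    return mean(mean(r[i] for i in items) for r in responses)

# Each dict is one staff respondent's item scores (illustrative 1-5 ratings).
pre = [{"mission1": 4, "mission2": 3, "cohesion1": 2},
       {"mission1": 5, "mission2": 4, "cohesion1": 3}]
post = [{"mission1": 4, "mission2": 5, "cohesion1": 4},
        {"mission1": 5, "mission2": 5, "cohesion1": 4}]

scales = {"mission": ["mission1", "mission2"], "cohesion": ["cohesion1"]}
for name, items in scales.items():
    change = subscale_means(post, items) - subscale_means(pre, items)
    print(f"{name}: change in readiness = {change:+.2f}")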
EQUIP Semi-Structured Interviews
• Conducted pre-, mid-, and post-implementation
• Versions for providers, administrators, and VISN leaders
• Covered in consent
• Face-to-face recorded interviews
• Professionally transcribed
• Analyzed after each round
EQUIP Participant Observation: Field Journal
• Primary method of capturing data from observant participation
• "If you didn't write it down in your field notes, then it didn't happen." (at least in terms of data analysis)
• 3 kinds of notes
– Records of events observed and information given
– Records of prolonged activities
– Chronological daily diary
EQUIP Quality Coordinator Logs
• Submitted monthly by RN Quality Coordinator
• What % of time was spent on each aspect of the clinical intervention
• Will be able to look across sites to see variation in time spent on clinical activities; can see if this relates qualitatively to implementation at each site
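As a sketch of the cross-site comparison described above, assuming hypothetical log fields of site, activity, and hours (not the actual EQUIP log format):

from collections import defaultdict

# Illustrative monthly log entries: (site, clinical activity, hours).
log_entries = [
    ("Site A", "wellness groups", 30), ("Site A", "supported employment", 10),
    ("Site B", "wellness groups", 12), ("Site B", "supported employment", 28),
]

hours = defaultdict(lambda: defaultdict(float))
for site, activity, hrs in log_entries:
    hours[site][activity] += hrs

# Percent of logged time per activity, by site, for cross-site comparison.
for site, acts in hours.items():
    total = sum(acts.values())
    for activity, hrs in acts.items():
        print(f"{site}: {activity} = {100 * hrs / total:.0f}% of logged time")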
Critical Measures of Implementation
Integrity of innovation
– Fidelity to planned implementation strategy
– Dose of intervention delivery, when variability is possible
– Requires clear operational definitions of intervention components
Exposure to innovation
– Degree to which the intervention is experienced by targeted users
– Dose of exposure, when variability is possible
– Requires clear operational definitions for measuring intervention exposure
Intensity of implementation
– E.g., implementation or intensity scores for multifaceted interventions
– E.g., "goal attainment scaling" when the strategy allows local adaptation or choice of alternative interventions across sites
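One concrete option for the last bullet: goal attainment scaling is conventionally scored with Kiresuk & Sherman's T-score. A minimal sketch, where the goals, attainment levels, and weights are hypothetical:

from math import sqrt

def gas_t_score(scores, weights, rho=0.3):
    # T = 50 + 10*sum(w*x) / sqrt((1-rho)*sum(w^2) + rho*(sum(w))^2),
    # with each attainment score x on the -2..+2 GAS scale.
    num = 10 * sum(w * x for w, x in zip(weights, scores))
    den = sqrt((1 - rho) * sum(w * w for w in weights) + rho * sum(weights) ** 2)
    return 50 + num / den

# Hypothetical site: exceeded its wellness goal (+1) but fell short on
# supported employment (-1), with equal importance weights.
print(gas_t_score(scores=[1, -1], weights=[1, 1]))  # 50.0: net attainment is zero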
Triangulation
Critical to collect information about implementation from multiple sources
– Be prepared for disagreement
– Perspectives and opportunities for observation differ for managers, providers vs. patients
Recognize differences between "exposed" sample and practice population
– Does the "enrolled" group represent the practice?
– Did the intervention penetrate among all providers?
Telling the story of variable implementation
Examine range of data sources as a team
– Throughout course of data collection
– Discuss which data sources answer which questions
Examine which data sources are complementary
– Which data sources should be triangulated?
– What questions are raised or what answers are provided?
Telling the story of variable implementation
Use qualitative data analysis software to facilitate mixed methods analysis
– Multiple data sources
– Multiple grouping options (e.g., by site, by stakeholder, by data collection time points)
– Team-based analysis
– Ongoing, iterative analysis informing implementation efforts
Software Support: ATLAS.ti
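A minimal sketch of the grouping logic such software supports; the records and codes below are illustrative, not ATLAS.ti's actual data model or API:

from collections import defaultdict

# Illustrative coded excerpts tagged by site, stakeholder, and time point.
excerpts = [
    {"site": "Site A", "stakeholder": "clinician", "wave": "pre",
     "code": "leadership support", "text": "Our chief backs the project."},
    {"site": "Site A", "stakeholder": "manager", "wave": "mid",
     "code": "leadership support", "text": "Priorities shifted this quarter."},
]

def group_by(records, *keys):
    # Group excerpt texts by any combination of tags (site, wave, etc.).
    groups = defaultdict(list)
    for r in records:
        groups[tuple(r[k] for k in keys)].append(r["text"])
    return groups

for (site, wave), texts in group_by(excerpts, "site", "wave").items():
    print(site, wave, len(texts), "excerpt(s)")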
Telling the story of variable implementation
Audience considerations
– Throughout course of data collection
– Which data sources answer which questions, for whom
– Issue of providing feedback to sites
Product considerations
– Which data sources should be triangulated?
– What questions are raised or what answers are provided?
– How much and what should go into which products?