Developing the Dashboard


Transcript of Developing the Dashboard

Page 1: Developing the Dashboard


Developing a Dashboard Measurement and Evaluation:

Understanding the Impact of Innovation

October 29, 2013

Page 2: Developing the Dashboard


Objectives

Describe the dashboard development process in the context of MGH Innovation Unit work.

Identify barriers and facilitators in developing, implementing, and sustaining the Innovation Unit Dashboard performance measurement tool.

Page 3: Developing the Dashboard


Agenda

Overview of Innovation Units

Empirical Outcomes

Dashboard Development Rationale

Innovation Unit Dashboard: metric selection; data sources & relevant benchmarks

Using data to tell stories

Future considerations

Page 4: Developing the Dashboard

Positioning MGH for the Future

The Patient Journey

Direct Patient Care: ED, Periop, Inpatient (Innovation Units)

Technology Application: Partners E-Care, Outcomes Registries

Population Management: Reducing the Trend of Healthcare Costs, Long-Term Outpatient Care

Patient Affordability for MGH & Payers: Overhead (Non-Labor Costs)

Care Redesign: Multidisciplinary Services, Large Patient Population, Big $$$

Incentives: Intrinsic and Extrinsic

Page 5: Developing the Dashboard


Innovating Care at MGH

Innovation Units are tests of change that will help us quickly identify what works and what does not work to improve the quality of care delivered to our patients.

High-performing interdisciplinary teams that deliver safe, effective, efficient, timely, equitable care that is patient- and family-centered.

Standardization of processes and care reduces variation and introduces a systematic approach to improving quality and safety in the inpatient setting.

Identify and prioritize hazards and opportunities for standardization, then implement evidence-based methods to rectify the problem.

We are attempting transformational change.

Page 6: Developing the Dashboard


Three Key Areas of Focus and Four Desired Outcomes

Focus

1. New Culture through Relationship-Based Care

2. New Role of Attending Nurse; Domains of Practice

3. Standardized Processes: throughput and LOS reduction; technology; controlling variation; implementing evidence-based practice

Outcomes

1. Patient Satisfaction: care is equitable and patient- and family-focused

2. Clinical Quality: to improve quality and to make care safer

3. Unit Cost Reductions: to make care more cost effective

4. Staff Satisfaction: to remain a great place to practice

Page 7: Developing the Dashboard


“Patient Journey” Framework

Where are there opportunities to reduce costs across these processes of care?

Before: Preadmission care; Admission process (ED, direct admits, transfers)

During: Patient stay (direct patient care, tests, treatments, procedures, clinical support, operational support); Discharge process

Post: Post-discharge care

Support functions: Finance, Information Systems, HR

Goal: High-performing interdisciplinary teams that deliver safe, effective, timely, efficient, and equitable care that is patient- and family-centered.

Page 8: Developing the Dashboard


Innovations in Care Delivery “Patient Journey” Framework – Initial 15 Interventions

Before: Pre-admission care; Admission process (ED, direct admits, transfers)

During: Patient stay (direct patient care, tests, treatments, procedures, clinical support, operational support); Discharge process

After: Post-discharge care

Interventions across the journey: Relationship-based care ♦ The Attending Nurse role ♦ Hand-over Rounding Checklist ♦ Discharge planning (estimated discharge date, discharge disposition) ♦ Welcome Packet (notebook and discharge envelope) ♦ Domains of Practice ♦ Daily Interdisciplinary Team Rounds ♦ Electronic unit whiteboards ♦ In-room whiteboards ♦ Smart phones ♦ Wireless laptop computers/tablets ♦ Business cards ♦ Hourly rounding ♦ Quiet hours ♦ Discharge follow-up call program

Goal: High-performing, interdisciplinary teams that deliver safe, effective, timely, efficient, and equitable care that is patient- and family-centered.

Copyright MGH 2012

Page 9: Developing the Dashboard


Focus on Empirical Outcomes

• Focus on “What difference have you made?”

• Shift from structure and process to outcomes.

• Key indicators that paint a picture of the organization.

Page 10: Developing the Dashboard

Donabedian Model

Donabedian, 1966; 1990

© American Nurses Credentialing Center

Page 11: Developing the Dashboard

Innovation Cluster Focus Areas *

Cluster focus areas: Communication; Patient Engagement; Roles & Structures

Interventions **

Throughout Admission: Relationship-Based Care; Attending Nurse; Handover Rounding Checklist

Pre-Admission: Pre-Admit Data Collection; Welcome Packet

During Admission: Domains of Practice; Interdisciplinary Rounds; Business Cards; Quiet Hours; Hourly Rounding; Electronic White Boards; In-Room White Boards; Smart Phones; Hand-Held Devices/Tablets

Post-Discharge: Discharge Follow-up Phone Calls

Others as identified

Education

Evaluation (Pre, During, Post)

Quantitative measures: HCAHPS; Leadership Influence over Professional Practice Environments (LIPPES); LOS; Quality Indicators; Patients’ Perceptions of Feeling Known (PPFKN); Readmissions; Revised Perceptions of Practice Environment Scale (RPPE); Cost per Case Mix; Staff Retention; Survey of the Innovation Unit Expectations (SIUE-pre); Survey of the Innovation Unit Experiences (SIUE-post); other measures as identified

Qualitative measures: Focus Groups (Staff, Patients, Families, etc.); Observations; Narratives

* The clusters are a lens with which we gain perspective on any particular intervention.

** May apply to any or all 3 of the cluster focus areas. (June 2013)

Page 12: Developing the Dashboard


Why a Dashboard

“Rapid Improvement in any field requires measuring results…”

— Porter & Lee

Page 13: Developing the Dashboard


Data and Information

Data are individual facts, statistics or items of information. (http://dictionary.com)

Information is the result of processing, manipulating and organizing data in a way that adds to the knowledge of the person receiving it. (http://en.wikiquote.org)

Data → Information

Page 14: Developing the Dashboard


Challenge

Create an easy-to-use dashboard tool

Implement quickly

Consolidate Key Performance Indicators (KPIs) from multiple hospital sources

Provide visibility of data across Innovation Units

Use current benchmarks to measure performance

Foster data transparency

Drive improvement through the PDSA cycle: supporting change with data, testing changes, spreading improvements

Page 15: Developing the Dashboard


Dashboard strategy

Begin with the end in mind

Know your customers / understand how they use information

Know the questions the dashboard is trying to answer

When thinking about metrics, make sure you can actually collect the data

Develop a draft, engage users, and iterate

Disseminate to users

Refine as needed

Periodically revisit dashboard needs as they relate to the ongoing measurement plan.

Page 16: Developing the Dashboard


Dashboard tactics

Create a shell, simple mockup
• Make sure all functions and levels of info are represented.
• Make sure it is feasible to obtain all the data.
• Figure out who will be doing the data aggregation and preparation.
• These steps will help determine scope.

Fill in the shell with metrics
• Complete a prototype (in Excel).
• Consolidate draft metrics from multiple sources into a single, concise, printable view (see the sketch after this list).
• Highlight performance relative to benchmarks with visual displays.
• Refine structure and design, including time periods for reporting.

Demo/pilot the dashboard
• Helps set expectations with users.
• Validates format and metrics.

Document business requirements
• Includes calculation of metrics and description of benchmarks and data sources.

Plan for updates
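The consolidation step above — pulling draft metrics from multiple source extracts into a single, concise, printable view — might look like the following minimal sketch in Python/pandas. The file names, column layout, and metric labels are hypothetical placeholders rather than the actual MGH source feeds, and the real prototype was built in Excel.

```python
# Minimal sketch: consolidate metric extracts from several source systems into
# one "at a glance" view (metrics down the side, units across the top).
# File names, columns, and metric labels are hypothetical placeholders.
import pandas as pd

frames = []
for path, metric in [
    ("ndnqi_falls_q3.csv", "Falls per 1,000 Patient Days"),
    ("nhsn_clabsi_q3.csv", "CLABSI per 1,000 Line Days"),
    ("hcahps_quiet_q3.csv", "HCAHPS: Quiet at Night"),
]:
    # Each source exports its own format; normalize every extract to one shape:
    # one row per unit with a single numeric 'value' column.
    df = pd.read_csv(path)                       # expects columns: unit, value
    frames.append(df[["unit", "value"]].assign(metric=metric))

combined = pd.concat(frames, ignore_index=True)

# Pivot into the single, concise, printable layout used on the dashboard.
at_a_glance = combined.pivot(index="metric", columns="unit", values="value")
print(at_a_glance.round(2).to_string())
```

In practice each extract needs its own normalization step (different column names, rates vs. counts, different reporting periods), which is why documenting business requirements per metric matters.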

Page 17: Developing the Dashboard


Dashboard Development

Goal: Measure the impact of Innovation Units’ interventions. Tactic: Reliably store and communicate evaluation data.

Metric categories: Throughput and Efficiency; Quality and Safety; Infection Control; Patient Satisfaction; Staff Satisfaction.

Innovation Unit Dashboard sample: “Massachusetts General Hospital - PCS Innovation Unit Dashboard At a Glance.” The Quality and Safety section lists patient-centered outcome measures for each unit: falls per 1,000 patient days, falls with injury per 1,000 patient days, hospital-acquired (HA) pressure ulcers (total and Type II or greater), restraints, peripheral intravenous (PIV) infiltrations (pediatric/neonatal), and central line-associated bloodstream infections per 1,000 line days (CLABSI). Each rate is shown with its observed count and an NDNQI or NHSN benchmark (for example, the NDNQI surgical adult fall benchmark of 2.57 and the critical care adult benchmark of 0.99), and cells are color-shaded relative to the benchmark: a rate better (lower) than the benchmark is shaded one color, a rate worse (higher) than the benchmark another.
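The shading rule behind that at-a-glance view is simple; the sketch below shows one way to express it, assuming each metric carries a benchmark and a “lower is better” or “higher is better” direction. The shade helper and its treatment of ties are illustrative assumptions, not the dashboard’s actual Excel formatting logic; the example values (a fall rate of 2.78 against the 2.57 surgical adult benchmark) come from the sample above.

```python
# Sketch of the benchmark shading rule for one dashboard cell.
# Illustrative only; the real dashboard was an Excel prototype with
# color shading, so this helper is just a stand-in for that logic.

def shade(rate, benchmark, lower_is_better=True):
    """Return 'better', 'worse', or 'NA' for one dashboard cell.

    Ties are counted as meeting the benchmark ('better').
    """
    if rate is None or benchmark is None:
        return "NA"
    if lower_is_better:
        return "better" if rate <= benchmark else "worse"
    return "better" if rate >= benchmark else "worse"

print(shade(2.78, 2.57))                         # fall rate above benchmark -> 'worse'
print(shade(0.95, 2.57))                         # fall rate below benchmark -> 'better'
print(shade(96.0, 90.0, lower_is_better=False))  # e.g., a compliance percentage -> 'better'
```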

Page 18: Developing the Dashboard


NSI Reporting Examples: Pre-Innovation Unit Launch (Summer 2011)

Reporting for select metrics through the Excellence Every Day Portal.

Quarterly unit-level data and charts developed and stored in a Shared File Area for RN Leadership (printed color copies delivered to units).

Samples: unit-level report for the CAUTI metric; Shared File Area folder structure; Excellence Every Day Portal page.

Page 19: Developing the Dashboard


What we did: Pre-Innovation unit launch

Identified the need for a robust, comprehensive tool for Nursing Sensitive Indicator (NSI) reporting.

PCS had an initial Executive Committee Dashboard in place.

Talked with internal experts on Strategic Performance Indicator reporting.

[Sample Strategic Performance Indicator (HPM) dashboard: under the “Uniform High Quality” section, hospital-level composites (3.2 CHF Composite, 3.3 PNE Composite, 3.4 SCIP Wound Infection Composite, 3.5 SCIP-VTE Rate of VTE Prophylaxis) are reported quarterly from Dec 09 through Sep 10 for Hospital and Leader, with color-scoring thresholds for the current period; H = higher is better, L = lower is better. Note: Sample dashboard for demonstration purposes only.]

Page 20: Developing the Dashboard


What we did: Metric selection

1. Improve Patient Experience
Metrics: Nurse Communication; Quiet at Night; Responsiveness; Cleanliness; Pain Management; Overall Rating; Discharge Information
Interventions: Hourly Rounding; Quiet Hours; Pain Tiger Team; Hotel-Style Cleaning; Smart Phones and Whiteboards; Patient/Family Notebook; Discharge Envelope; Discharge Phone Calls; Unit-Based Patient Advocate
Source: Service Excellence (HCAHPS data)

2. Improve Quality: Decrease Hospital-Acquired Conditions
Metrics: CAUTI; Falls with Injury; CLABSI; Central Line Infections; Pressure Ulcers; Restraint Utilization; Peripheral Intravenous (PIV) Infiltrations
Interventions: Relationship-Based Care; Attending Nurse; Handover Rounding Checklist; Domains of Practice; Interdisciplinary Rounds; Business Cards; Quiet Hours; Hourly Rounding; Electronic Whiteboards; Smart Phones
Sources: Infection Control; Patient Care Services Office of Quality and Safety

Page 21: Developing the Dashboard


What we did: Metric selection, continued

3. Reduce Costs
Metrics: Direct Cost per Case-Mix-Adjusted Discharge; Labor Expense; Hours Worked per Equivalent Patient Day; Medical Supply Expense; Total Expense
Interventions: Follow-Up Phone Calls; Handovers; Whiteboards; Smart Phones; Discharge Envelope
Sources: PHS Finance; MGH Finance

4. Maintain Staff Satisfaction
Metrics: Professional Practice Environment Staff Perception Survey Mean Scores; NDNQI RN Survey Practice Environment Scale–Nursing Work Index Mean Score
Interventions: Relationship-Based Care; Attending Nurse Role; Domains of Practice
Source: Institute for Patient Care

Optimize Efficiency/Throughput
Metrics: Readmissions; ALOS; Admission/ED Admit Volume
Interventions: Standardized Processes; Use of Technology; Controlling Variation; Implementing Evidence-Based Practice; Safe Handover Communication
Sources: Finance Department; Admitting Department; Center for Quality and Safety

Page 22: Developing the Dashboard


What we did: Data sources & benchmarks

Data sources feeding the dashboard:
• PHS Finance (EPSI)
• MGH Finance (EPSI/Action OI)
• Patient Care Services (NDNQI)
• Infection Control (CDC)
• PCS Institute for Patient Care (Staff Perception Surveys)
• Admitting (PATCOM)
• ED Information System (EDIS)
• Service Excellence (HCAHPS)
• Service Excellence (Pediatric Survey)
• CQS (EPSI - Readmissions)

[Sample: “Massachusetts General Hospital - PCS Innovation Units Dashboard,” Quality and Safety section, with individual units listed across the top (Ortho White 6, Oncology Lunder 9, Medicine Ellison 16, Surgery White 7, Pediatrics, Psych Blake 11, Vascular Bigelow 14, Obstetrics Blake 13, ICU Blake 12, NICU Blake 10, CICU Ellison 9, Ellison 17, Ellison 18) and rates color-shaded relative to benchmark: better (lower) or worse (higher). Catheter-associated urinary tract infections per 1,000 device days and ventilator-associated pneumonia per 1,000 vent days to be reported beginning FY 2012.]

Many sources and contacts. Many (many) data formats.

Page 23: Developing the Dashboard


What we did: Dashboard notes

The notes define each metric, its calculation, benchmark, data source, contact, and reporting frequency.

Massachusetts General Hospital - PCS Innovation Unit Dashboard Notes (Metric | Notes | Calculation | Benchmark | Data Source | Contact | Frequency)

Total Fall Rate
Notes: The number of fall events reported, calculated per 1,000 patient days. Lower values are better performance. At this time there are no unit-based benchmarks for patient falls, so we are using benchmarks from Partners HPM. In September 2010, PCS Quality & Safety began to submit data to NDNQI, and unit-based benchmarks will soon be available.
Calculation: Number of events reported, per 1,000 patient days.
Benchmark: NDNQI. Data source: Incident reports from RL Solutions. Contact: Nancy McCarthy. Frequency: Quarterly.

Falls with Injury Rate
Notes: The number of patient falls with injury is the number of events reported that resulted in patient injury, calculated per 1,000 patient days. Lower values are better performance. At this time there are no unit-based benchmarks for patient falls, so we are using benchmarks from Partners HPM.
Calculation: Number of events reported that resulted in patient injury, per 1,000 patient days.
Benchmark: NDNQI. Data source: Incident reports from RL Solutions. Contact: Nancy McCarthy. Frequency: Quarterly.

HA Pressure Ulcer Rate (all stages)
Notes: Quarterly pressure ulcer incidence rate data are collected in a one-day prevalence study: number of hospital-acquired pressure ulcers (any stage). Lower values are better.
Calculation: Numerator = number of hospital-acquired pressure ulcers; Denominator = total number of discharges(?).
Benchmark: NDNQI. Data source: One-day prevalence/incidence study conducted by nursing reps from the PCS Office of Quality and Safety. Contact: Nancy McCarthy. Frequency: Quarterly.

HA Pressure Ulcer Rate (Stage II or greater)
Notes: Quarterly pressure ulcer incidence rate data are collected in a one-day prevalence study: number of hospital-acquired, Stage II or greater pressure ulcers. Lower values are better.
Calculation: Numerator = number of hospital-acquired, Stage II or greater pressure ulcers; Denominator = total number of discharges(?).
Benchmark: NDNQI. Data source: One-day prevalence/incidence study conducted by nursing reps from the PCS Office of Quality and Safety. Contact: Nancy McCarthy. Frequency: Quarterly.

Restraint Rate
Notes: Restraint prevalence.
Calculation: Percent of patients in restraints.
Benchmark: NDNQI. Data source: One-day prevalence/incidence study conducted by nursing reps from the PCS Office of Quality and Safety. Contact: Office of Quality and Safety. Frequency: Quarterly.

Peripheral Intravenous (PIV) Infiltrations
Notes: NEW; pediatric and neonatal populations. Total PIV infiltration point prevalence: the total number of PIV infiltrations (Grades 2-4) on the unit divided by the total number of PIV sites on the unit. For children less than age 10, a Grade 1 infiltration is defined identically to a Grade 2.
Benchmark: NDNQI. Data source: One-day prevalence/incidence study conducted by nursing reps from the PCS Office of Quality and Safety. Contact: Office of Quality and Safety. Frequency: Quarterly.

CLABSI Infection Rate
Notes: Quarterly line infection incidence rate.
Calculation: Numerator = number of line infections (hospital-acquired); Denominator = 1,000 line days.
Benchmark: NHSN pooled mean. Data source: Infection Control. Contact: Paula Wright, Irene Goldenshtein. Frequency: Quarterly.
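To make the calculation columns above concrete, the sketch below expresses the arithmetic as hypothetical helper functions. It follows the definitions in the notes (events per 1,000 patient or line days; point prevalence from the one-day study) rather than any official NDNQI or NHSN implementation, and the example denominators are invented.

```python
# Sketch of the rate calculations described in the dashboard notes.
# Illustrative helpers only, not NDNQI/NHSN reference implementations.

def rate_per_1000(events, days):
    """Events per 1,000 patient days (falls) or per 1,000 line days (CLABSI)."""
    return 1000.0 * events / days if days else None

def prevalence_pct(cases, patients_surveyed):
    """Point prevalence (%) from a one-day prevalence study, e.g. pressure
    ulcers, restraints, or PIV infiltrations."""
    return 100.0 * cases / patients_surveyed if patients_surveyed else None

# Example: 7 reported falls over an assumed 2,516 patient days is about
# 2.78 falls per 1,000 patient days, the format of the 'Total Fall Rate' row.
print(round(rate_per_1000(7, 2516), 2))   # -> 2.78
print(round(prevalence_pct(1, 14), 1))    # 1 of 14 surveyed patients -> 7.1%
```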

Page 24: Developing the Dashboard


Success of Dashboard Tool

Initial dashboard pushed out when the first 12 Innovation Units launched; the dashboard expanded as other phases rolled out.

Accessed centrally in Shared File Areas and on the Intranet.

“I post the dashboard on our Communication Board on our unit.”

“I love the opportunity to be transparent with my staff—it facilitates ownership of the clinical practice and an understanding of the global picture.”

“It inspires my staff to ask questions about what we could be doing differently.”

“I used the dashboard as part of the rollout of interventions.”

“Data are key for having conversations with my staff.”

Page 25: Developing the Dashboard


Using Data to Tell Stories

Quantitative data never tell the full story

Project outcomes are also measured with qualitative themes from interviews and observations.

Promote narrative culture

“Not everything that can be counted counts, and not everything that counts can be counted.”

— Albert Einstein, Physicist

Page 26: Developing the Dashboard


Page 27: Developing the Dashboard


Page 28: Developing the Dashboard


Future Considerations

Involve stakeholders in the development process

Review and revise the list of metrics

Simplify: identify “need to know” vs. “nice to know” data

Connect the dashboard with trend information

Look beyond red/yellow/green; include graphical and visual representations of data (see the sketch below)

Provide detailed notes and caveats

Maintain data integrity

Automate (to the extent possible)
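For the trend-information and graphical-representation items above, a quarterly trend line plotted against a benchmark is one simple step beyond red/yellow/green shading. The sketch below uses Python/matplotlib with invented quarters, rates, and output file name; the benchmark value echoes the surgical adult fall benchmark from the sample dashboard.

```python
# Minimal sketch of a trend view to supplement red/yellow/green shading:
# quarterly fall rates for one unit plotted against a benchmark reference line.
# Quarters, rates, and the output file name are hypothetical examples.
import matplotlib.pyplot as plt

quarters = ["FY12 Q1", "FY12 Q2", "FY12 Q3", "FY12 Q4"]
fall_rate = [3.1, 2.8, 2.6, 2.4]      # falls per 1,000 patient days
benchmark = 2.57                      # e.g., NDNQI surgical adult benchmark

plt.figure(figsize=(6, 3))
plt.plot(quarters, fall_rate, marker="o", label="Unit fall rate")
plt.axhline(benchmark, linestyle="--", label="Benchmark")
plt.ylabel("Falls per 1,000 patient days")
plt.title("Fall rate trend vs. benchmark")
plt.legend()
plt.tight_layout()
plt.savefig("fall_rate_trend.png")    # or plt.show() for interactive review
```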

Page 29: Developing the Dashboard


Critical Success Factors

Shared vision

Leadership engagement

Agreement on metrics and data definitions

Accountability

Page 30: Developing the Dashboard


Resources

Edward Tufte, www.edwardtufte.com

Page 31: Developing the Dashboard


Questions?

“Discovery consists in seeing what everyone else has seen and thinking what no one else has thought.”

— Albert Szent-Gyorgyi, Hungarian Biochemist, 1937 Nobel Prize Winner