Measurement

Measuring our success… Proving our improvement. Sally Batley, 23 June 2015

Transcript of Measurement

Page 1: Measurement

Measuring our success… Proving our improvement
Sally Batley

23 June 2015

Page 2: Measurement

Accelerating Innovation

What will I get out of the ‘Measuring our success… proving our improvement’ workshop?

• A good understanding of measurement for improvement
• A few what-not-to-dos
• A lot of what you should do!
• A measurement checklist to take away and use
• A little bit of history
• Some time to work through your own improvement measures
• Group work with people who can help challenge and confirm your thinking
• Discussion around the HF measurement framework

Page 3: Measurement

Directing our efforts at delivery


Taken from Brent James

Page 4: Measurement


Responsibility for Change

Do with…

Page 5: Measurement

The measurement grid


[Grid: quantitative measures (‘numbers’) and qualitative measures (‘stories’) mapped against our outputs and NHS outcomes, showing the relationship between types of measure and outputs/outcomes]

Page 6: Measurement


[Diagram: ‘Control Level’ against ‘Learning’ – environmental context, organisational context, microsystems, patient and community]

Adapted from Brent James

Page 7: Measurement


Top down
• Taylor – “hardly a competent workman can be found”
• Criticise / control
– (helpfully) point out mistakes
– “power over”
– re-educate
• Judgement (playing God)
• Heroic individualism – the “Lone Ranger” syndrome
• Unfunded mandates – layered on top; assumes unlimited time / attention / resources
• Motivate / incentivise

Bottom up
• Deming – almost all failures arise from underlying processes
• Empowerment
– drive out fear; put joy into work
– “power to” (shared vision)
– supply vision and tools; facilitate
• Learning (a “servant king”)
• Teams – with fundamental knowledge
• Integrated tools – carefully built into workflow
• Make it easy to do it right (align incentives)

Taken from Brent James

Page 8: Measurement

Challenge 1

“What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”

Herbert Simon 1916-2001 Scientist


Page 9: Measurement

Assumption to assurance

• ASSUMPTION – A proposition that is taken for granted, that is, as if it were known to be true, used to draw a conclusion

• ASSURANCE – A declaration to inspire full confidence, freedom from doubt, a conclusion based on evidence

[Diagram: assumption sits with the middle/junior management/clinical focus; assurance with the senior/executive management/clinical focus; between them lies a potential information/intelligence gap]


Page 10: Measurement

Journey to understanding


Page 11: Measurement

It is what you do with it that matters!


Page 12: Measurement

Our job – eliminate the noise


Page 13: Measurement

But we could just add to it…


Page 14: Measurement

How do we stop serving up terrible data?


Page 15: Measurement


Measurement Sins

• Measuring the wrong thing
• Having no baseline (or having a compromised one)
• Only collecting data at 2 points in time
• Presenting results in a misleading way
• Inappropriate or mindless use of statistics

Page 16: Measurement


What is the difference between data and information?


Page 17: Measurement

Information can be of immediate use to the end user, whereas data needs to be ‘processed’ first

• If the plane you were on was crash landing, would you want “Data” or “Information”?


Page 18: Measurement

What accuracy is required?
• Q: What is the position of the space shuttle in orbit?
• A: 115–400 miles above the earth

NASA: needs to know to the nearest metre? GCSE physics student: to the nearest 10 km? Shuttle crew manoeuvring to engage with a satellite needs…?

Relative distance matters!

Page 19: Measurement


Journey of Patients/Clients

[Diagram: citizens/clients/patients move from self care and primary care, via planned or emergency entry, through a hospital visit, stay and treatment; supported by clinical support processes (diagnostic, medication, treatment, theatres) and management processes (information, improvement, IT, purchasing, distribution, HR); then planned exit home, with re-assessment and aftercare back to self care and primary care]

Page 20: Measurement


Knowing when ‘sufficient is enough’ is not an exact science

[Chart: quality plotted against time – a curve of diminishing returns, with “good enough” reached well before perfection; how much time is available?]

Page 21: Measurement

The happy medium…


Page 22: Measurement

3 different points give you…

[Six small charts, each plotting three data points on a 0–4 scale: depending on which three points you pick, the same measure can be labelled an "Upward Trend", a "Setback", a "Downturn", a "Turnaround", a "Rebound" or a "Downward Trend"]

Page 23: Measurement

The challenge for measuring for improvement

• Metrics: Focus on a small set of key quality & cost metrics

• Comparisons: Use benchmarking data and transparency

• Compete: Create positive competition so that all can raise the bar & all are motivated to continuously improve on key success metrics

• Current Good Practices: Share widely what works best


Page 24: Measurement

Components of the system which we should and can measure to see the improvement picture


Page 25: Measurement


‘Data should always be presented in such a way that preserves the evidence in the data…’

Walter Shewhart


Page 26: Measurement

Two point comparisons


Specialty   | Q2 03/04 | Q2 04/05 | Variance | % variance
Gen Surg    | 1897     | 1835     | -62      | -3.3
Urology     | 1769     | 1758     | -11      | -0.6
ENT         | 521      | 570      | 49       | 9.4
T&O         | 1904     | 1945     | 41       | 2.2
Ophthal     | 391      | 300      | -91      | -23.3
Oral Surg   | 274      | 328      | 54       | 19.7
Gynae       | 631      | 600      | -31      | -4.9
Total       | 7387     | 7336     | -51      | -0.7
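As an illustrative sketch (Python, figures taken from the table above), the entire calculation behind a two-point comparison is one subtraction and one division per specialty – which is exactly why it says nothing about the underlying variation:

```python
# Two-point comparison: the whole arithmetic behind the table above.
q2_0304 = {"Gen Surg": 1897, "Urology": 1769, "ENT": 521, "T&O": 1904,
           "Ophthal": 391, "Oral Surg": 274, "Gynae": 631}
q2_0405 = {"Gen Surg": 1835, "Urology": 1758, "ENT": 570, "T&O": 1945,
           "Ophthal": 300, "Oral Surg": 328, "Gynae": 600}

for spec in q2_0304:
    variance = q2_0405[spec] - q2_0304[spec]          # absolute change
    pct = 100 * variance / q2_0304[spec]              # % change vs the earlier quarter
    print(f"{spec}: {variance:+d} ({pct:+.1f}%)")
```

Two data points can tell you which number is bigger, but nothing about whether the change is anything other than routine variation.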

Page 27: Measurement

Different numbers


Given two different numbers, one will always be bigger than the other

[Chart: two bars, ‘Last month’ and ‘This month’, labelled ‘Something very important!’]

Page 28: Measurement

Every picture tells a story … Does it?

Reasons for Delayed Transfer, August 2003

[Pie chart: awaiting assessment <7 days 9%; awaiting assessment >7 days 9%; awaiting public funding 5%; awaiting further NHS care 12%; awaiting residential care 25%; awaiting domiciliary package 12%; patient/family choice 18%; other reasons 10%]

Looks pretty – but cannot show change over time


Page 29: Measurement

The real use of a pie chart


Page 30: Measurement

Does every picture tell the right story?

Trust   | Performance (mins) | Target (mins)
Trust A | 40.8               | 40
Trust B | 39.1               | 40
Trust C | 35.95              | 40

1) Be careful of averages
2) The fewer the data points, the less variation can be seen and understood


Page 31: Measurement


[Run charts for Trust A, Trust B and Trust C: individual times (0–80 minutes) across 20 consecutive points – three very different patterns of variation behind three similar averages]

Page 32: Measurement


Looking at measurement for improvement
• It’s more than just data
• Getting a message across
• Consistent
• Easy to interpret
• Something you can act on
• Evidence based
• Should stand alone with little reader interpretation


Page 33: Measurement

You have been asked to prepare a report on the following


2013: Jan 151, Feb 147, Mar 111, Apr 167, May 114, Jun 106, Jul 153, Aug 111, Sep 150, Oct 123, Nov 127, Dec 145

2014: Jan 170, Feb 198, Mar 159, Apr 176, May 141, Jun 176, Jul 132, Aug 132


Page 34: Measurement

Is this better?

Month  | Admissions | Difference || Month  | Admissions | 03 v 04 | % change
Apr-03 | 151        |            || Apr-04 | 170        | 19      | 13
May-03 | 147        | -4         || May-04 | 198        | 51      | 35
Jun-03 | 111        | -36        || Jun-04 | 159        | 48      | 43
Jul-03 | 167        | 56         || Jul-04 | 176        | 9       | 5
Aug-03 | 114        | -53        || Aug-04 | 141        | 27      | 24
Sep-03 | 106        | -8         || Sep-04 | 176        | 70      | 66
Oct-03 | 153        | 47         || Oct-04 | 132        | -21     | -14
Nov-03 | 111        | -42        || Nov-04 | 132        | 21      | 19
Dec-03 | 150        | 39
Jan-04 | 123        | -27
Feb-04 | 127        | 4
Mar-04 | 145        | 18
Apr-04 | 170        | 25
May-04 | 198        | 28
Jun-04 | 159        | -39
Jul-04 | 176        | 17
Aug-04 | 141        | -35
Sep-04 | 176        | 35
Oct-04 | 132        | -44
Nov-04 | 132        | 0

[Bar chart: Number of Admissions by month of admission, Apr-03 to Nov-04, y-axis 0–250]


Page 35: Measurement

Even better?

[Line chart: Number of Admissions, April 2003 – November 2004, plotted by month (Apr-03 to Nov-04), y-axis 0–250]


Page 36: Measurement

What about an SPC chart?

Performance Report - Number of Admissions

[SPC chart of the 20 monthly admission counts: Data, Average (144.5), Lower limit (66.5), Upper limit (222.4)]

SPC Charts plot variation over time

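The average and limits quoted on this chart are consistent with a standard XmR (individuals) control chart: the centre line is the mean of the 20 monthly counts, and the limits sit at the mean ± 2.66 × the average moving range. As an illustrative sketch (Python, using the admissions figures from the preceding slides):

```python
# XmR (individuals) control chart limits for the monthly admissions,
# Apr-03 to Nov-04, taken from the earlier slides.
admissions = [151, 147, 111, 167, 114, 106, 153, 111, 150, 123, 127, 145,
              170, 198, 159, 176, 141, 176, 132, 132]

mean = sum(admissions) / len(admissions)                        # centre line
moving_ranges = [abs(b - a) for a, b in zip(admissions, admissions[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)                # average moving range

# 2.66 is the standard XmR scaling constant (3 / d2, with d2 = 1.128 for n = 2)
lcl = mean - 2.66 * mr_bar
ucl = mean + 2.66 * mr_bar

# mean = 144.45, lcl = 66.47, ucl = 222.43
print(f"{mean:.2f} {lcl:.2f} {ucl:.2f}")
```

To one decimal place these are the 144.5, 66.5 and 222.4 printed on the chart.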

Page 37: Measurement

Better still with more data points?

Admissions – weekly totals from 1 December 2003

[SPC chart: weekly admissions, 01 Dec 03 to 01 Nov 04, y-axis 0–50, showing Admissions, Average (35.0), Lower limit (25.1), Upper limit (44.9)]


Page 38: Measurement


A framework for the measurement and monitoring of safety

Source: Vincent C, Burnett S, Carthey J. The measurement and monitoring of safety. The Health Foundation, 2013. www.health.org.uk/publications/the-measurement-and-monitoring-of-safety

Page 39: Measurement

Components of the system which we should and can measure to see the improvement picture

• Input measures show what we need to do
• Process measures show how well we do what we say we do
• Output measures show what we accomplished
• Outcome measures show the impact
• Balancing measures show any unintended consequences

Page 40: Measurement

Measurement for improvement steps


Page 41: Measurement

Questions you may be asked

• Why are you showing me this?
• What was the sample size?
• Over what period was the data collected?
• Is this all of the data? (What did you leave out?)
• If a target is shown, how was it established?
• How was the data collected?
• Who collected the data?
• Did you encounter any problems gathering the data?
• If the data is aggregated, have you got the real data anywhere?
• What conclusions have you drawn? How?
• What do we need to do next?


[Cycle: Aim & Choose → Define → Collect → Analyse → Review]


Page 42: Measurement

Your measures checklist

• Why is it important?
• Who owns it?
• Data definitions
• Goals – what are we achieving?
• Collect – how is it going to be collected?
• Analyse & present
• Review – so what?


Page 43: Measurement

Step 1. Decide your aim

[Cycle: 1 Decide aim → 2 Choose measures → 3 Define measures → 4 Collect data → 5 Analyse & present → 6 Review measures → 7 Repeat steps 4–6]

A worthwhile topic. Outcome focused. Measurable. Specific population. Clear timelines. Succinct but clear.

Specific. Measurable. Achievable. Realistic. Time-bound.

Adapted from Tom Nolan in The Improvement Guide

Can you write your aim in a sentence?


Page 44: Measurement

Step 2. Choose measures

[Cycle: 1 Decide aim → 2 Choose measures → 3 Define measures → 4 Collect data → 5 Analyse & present → 6 Review measures → 7 Repeat steps 4–6]


Page 45: Measurement

Useful tools when choosing measures


Process maps

Driver diagrams


Page 46: Measurement

• Think of an improvement project you are working on right now
• What is your aim/objective?
• What outcome measure is related to your aim?
• What is your output measure?
• What process measure(s) link to those?
• What is your input measure?
• What should your balancing measure be?

Grab a flip chart. You have 10 minutes.


Page 47: Measurement

Step 3. Define measures

[Cycle: 1 Decide aim → 2 Choose measures → 3 Define measures → 4 Collect data → 5 Analyse & present → 6 Review measures → 7 Repeat steps 4–6]

An operational definition is a description, in quantifiable terms, of what to measure and the steps to follow to measure it consistently


Page 48: Measurement

Step 4. Collect data

• What – all people/patients, a portion or a sample?
• Who – collects the data?
• When – is it collected: real time or retrospective?
• Where – is it collected?
• How – is it obtained: computer system or audit?

[Cycle: 1 Decide aim → 2 Choose measures → 3 Define measures → 4 Collect data → 5 Analyse & present → 6 Review measures → 7 Repeat steps 4–6]


Page 49: Measurement

Step 5. Analyse and present

‘The type of presentation you use has a crucial effect on how you and others react to data’

[Cycle: 1 Decide aim → 2 Choose measures → 3 Define measures → 4 Collect data → 5 Analyse & present → 6 Review measures → 7 Repeat steps 4–6]


Page 50: Measurement

How we assess performance: RAG ratings

Sep 11  Oct 11  Nov 11  Dec 11  Jan 12  Feb 12  Mar 12  Apr 12  May 12  Jun 12  Jul 12  Aug 12
90      97      77      93      76      84      76      89      84      84      93      70

Why has performance deteriorated so badly? What decision are you going to make?


Page 51: Measurement


Performance Overview – October 2014

Indicator | YTD perf vs target | Perf trend / sustainability (latest 3 mths) | Exception report produced | Perf view on quality of plan | Position vs last month & PMO monitor | Target owner | Risks/comments and likely delivery against improvement date
NoF | G | G | Not required | Not required | G | CH | Patient Safety; Perf Notice Rec; loss of income in 2013/14; improvement date slippage
A&E – 4 hours | R | R | G | Not required | G | CH | Patient Safety; Perf Notice Rec; loss of income in 2013/14
A&E – CQIs | A | A | G | A | A | CH | Patient Safety; Perf Notice Rec; loss of income in 2013/14; CQC visits; regulatory issues
Stroke unit – 90% | G | G | Not required | Not required | R | CH | Patient Safety; increased risk on perf measures; Feb has met target – and sustained
HSMR | G | G | Not required | Not required | Not req’d | RC-H |
CDiff | A | A | G | Not required | R | CO | Patient Safety; CQC/regulatory issues

Page 52: Measurement

Accelerating Innovation

“If I could reduce my message to management to just a few words, I'd say it all has to do with reducing variation”.

W Edwards Deming


Page 53: Measurement

Walter A. Shewhart (early 1920s, Bell Laboratories)

• Every process displays variation:
• Some processes display controlled variation (common cause)
– stable, consistent pattern of variation
– constant causes / “chance”
• Others display uncontrolled variation
– pattern changes over time
– special cause variation / “assignable” cause


Page 54: Measurement

Other quality management gurus

• Dr Joseph M. Juran
– Best known for adding the ‘human’ dimension to quality
– Conceptualised the Pareto principle
– The Juran Trilogy: quality planning, quality control and quality improvement
– Also worked with Western Electric and Bell Technologies

• Dr Myron Tribus
‘Managing a company by means of the monthly report is like trying to drive a car by watching the yellow line in the rear view mirror.’
– Many roles, including Senior Vice President for Research & Engineering at Xerox


Page 55: Measurement

Accelerating Innovation

Indicator

YTD PerfVs

Target

Perf Trend -

Sustainability

(latest 3mths)

Exception

Report Produce

d

Perf View on

Quality of Plan

Improve-Date set by Owner/In-

Month Performance

Target Owner

Risks/Comments and likely delivery

against Improvement date

Position vs. last month

& PMO Monitor

NoF G G Not required Not required G CH

Patient SafetyPerf Notice Rec

Loss of Income in 2013/14

Improvement Date slippage

A & E- 4 hours R R G Not required G CH

Patient SafetyPerf Notice Rec

Loss of income in 2013/14

A & E- CQIs A A G A A CH

Patient SafetyPerf Notice Rec Loss of Income in

2013/14CQC visits

Regulatory issues

Stroke Unit

- 90%G G Not

required Not required R CH

Patient SafetyIncreased risk of

perf measures. Feb has met target – and

sustained

HSMR G G Not required Not required Not Req’d RC-H

CDiff A A G Not required R CO

Patient SafetyCQC/Regulatory

Issues

Performance Overview – October2014

55

Page 56: Measurement

Not so peachy

[SPC chart: percentage of patients achieving 90% of their time in a stroke unit, Apr 2012 – Apr 2013, y-axis 50–100, with baseline]

Verdict: stable within limits (66–100); not capable of achieving the target consistently

Page 57: Measurement

Step 6 – Review measures

It is a waste of time collecting and analysing your data if you don’t take action on the results.

Measurement can change behaviour – the silver bullet is that it must be the right measurement attached to the right story.

[Cycle: 1 Decide aim → 2 Choose measures → 3 Define measures → 4 Collect data → 5 Analyse & present → 6 Review measures → 7 Repeat steps 4–6]


Page 58: Measurement

What decision do you make?

Is your information presented in a way that allows you to confidently make one of these decisions?

Decision          | Because
Do nothing        | Performance OK
Contingency plans | Something out of the ordinary has happened – ‘special cause variation’
Process redesign  | We are not capable of achieving to the agreed expectation – ‘common cause variation’

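The decision logic in this table can be sketched in code. This is an illustrative Python sketch, not an official rule set: it uses only the simplest SPC signal (a point outside the control limits means special cause) for a ‘higher is better’ measure; real practice adds further run rules.

```python
def decide(values, lcl, ucl, target):
    """Map an SPC chart to one of the three decisions in the table.

    Illustrative sketch for a 'higher is better' measure, using only the
    simplest rule: a point outside the control limits signals special cause.
    """
    if any(v < lcl or v > ucl for v in values):
        return "Contingency plans"   # special cause: something out of the ordinary
    if lcl < target:
        return "Process redesign"    # routine (common cause) variation dips below target
    return "Do nothing"              # performance OK

# The stroke-unit monthly figures from Page 50 (assumed to be the same data as
# the Page 56 chart: stable within limits 66-100, target 90).
print(decide([90, 97, 77, 93, 76, 84, 76, 89, 84, 84, 93, 70], 66, 100, 90))
# -> "Process redesign"
```

Every point is inside the limits, so reacting month by month would be tampering; but because routine variation dips below the 90% target, the process itself needs redesigning.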

Page 59: Measurement

Types of calculation: when to use what

• Counts – when the target population does not change very much
– Example: number of falls on an elderly ward (always full)
• Percentages – when the numerator is a subset of the denominator
– Example: percentage of patients who fell
• Ratios or rates – numerator and denominator are measuring different things
– Example: falls per 100 bed days
• Time between or cases between – when you are tracking a ‘rare’ event, say one that occurs less than once a week on average
– Example: days since a patient last fell on this ward

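The four calculation types above can be sketched with toy numbers (Python; every figure below is invented for illustration):

```python
from datetime import date

# Toy falls data for one elderly ward (all numbers invented).
falls = 6                  # falls this month
patients = 30              # patients on the ward this month
patients_who_fell = 4
bed_days = 850

count = falls                                        # 1. Count
percentage = 100 * patients_who_fell / patients      # 2. Percentage (numerator is a subset)
rate = 100 * falls / bed_days                        # 3. Rate: falls per 100 bed days
last_fall = date(2015, 6, 1)                         # 4. Time between 'rare' events
days_since_last_fall = (date(2015, 6, 23) - last_fall).days

print(count, round(percentage, 1), round(rate, 1), days_since_last_fall)
```

Note the rate and the percentage answer different questions: the percentage divides patients by patients, while the rate divides falls by bed days, so it stays comparable even when occupancy changes.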

Page 60: Measurement


Plan vs reality


Page 61: Measurement

Looking at measurement for improvement

• It’s more than just data
• Getting a message across
• Consistent
• Easy to interpret
• Something you can act on
• Evidence based
• Should stand alone with little reader interpretation


Page 62: Measurement


A framework for the measurement and monitoring of safety

Has patient care been safe in the past? Ways to monitor harm include:
• mortality statistics (including HSMR and SHMI)
• record review (including case note review and the Global Trigger Tool)
• staff reporting (including incident reports and ‘never events’)
• routine databases.

Are our clinical systems and processes reliable? Ways to monitor reliability include:
• percentage of all inpatient admissions screened for MRSA
• percentage compliance with all elements of the pressure ulcer care bundle.

Is care safe today? Ways to monitor sensitivity to operations include:
• safety walk-rounds
• using designated patient safety officers
• meetings, handovers and ward rounds
• day-to-day conversations
• staffing levels
• patient interviews to identify threats to safety.

Will care be safe in the future? Possible approaches for achieving anticipation and preparedness include:
• risk registers
• safety culture analysis and safety climate analysis
• safety training rates
• sickness absence rates
• frequency of sharps injuries per month
• human reliability analysis (e.g. FMEA)
• safety cases.

Are we responding and improving? Sources of information to learn from include:
• automated information management systems highlighting key data at a clinical unit level (e.g. medication errors and hand hygiene compliance rates)
• at board level, dashboards and reports with indicators, set alongside financial and access targets.

Source: Vincent C, Burnett S, Carthey J. The measurement and monitoring of safety. The Health Foundation, 2013

Page 64: Measurement

Thank you for listening – any further questions?

[email protected] 229448


Page 65: Measurement

References

• The run chart basic reference
– ‘The run chart: a simple analytical tool for learning from variation in healthcare processes’; Perla R, Provost L, Murray S; BMJ Qual Saf 2011;20:46–51. doi:10.1136/bmjqs.2009.037895
• One of the best introductions to variation and SPC
– Understanding Variation, Don Wheeler, www.spcpress.com, 1986
• A couple of useful websites/blogs
– www.davisdatasanity.com
– www.kurtosis.co.uk
