Measuring Quality: Using Clinical Quality Indicators, Metrics and Dashboards to Measure Quality in Your Organisation
John Varlow, Director of Information Analysis, Health and Social Care Information Centre
Environment and Context
• System wide changes:
– A new system for commissioning, delivering, and accounting for health, public health and social care outcomes
– New structures and responsibilities between NHS England, Public Health England, the Health and Social Care Information Centre (HSCIC), the Department of Health (DH) and Government
– Attempt at genuine devolution to local organisations
– New regulatory functions for statutory bodies
The Quality Framework
NHS OUTCOMES FRAMEWORK
• Domain 1: Preventing people from dying prematurely
• Domain 2: Enhancing the quality of life for people with LTCs
• Domain 3: Recovery from episodes of ill health / injury
• Domain 4: Ensuring a positive patient experience
• Domain 5: Safe environment, free from avoidable harm

NICE Quality Standards (building a library of approx 150 over 5 years)
Commissioning Outcomes Framework
Commissioning Guidance
Provider payment mechanisms: tariff, standard contract, CQUIN, QOF

Commissioning / Contracting
• NHS Commissioning Board – certain specialist services and primary care
• GP Consortia – all other services
(Duty of quality applies throughout)
NHS OUTCOMES FRAMEWORK
• Domain 1: Preventing people from dying prematurely
• Domain 2: Enhancing the quality of life for people with LTCs
• Domain 3: Recovery from episodes of ill health / injury
• Domain 4: Ensuring a positive patient experience
• Domain 5: Safe environment, free from avoidable harm

NICE Quality Standards (building a library of approx 150 over 5 years)
Clinical Commissioning Group Outcomes Indicator Set
Commissioning Guidance
Provider payment mechanisms: tariff, standard contract, CQUIN, QOF

Commissioning / Contracting
• NHS Commissioning Board – certain specialist services and primary care
• GP Consortia – all other services
(Duty of quality applies throughout)
Outcomes Frameworks
• NHS Outcomes Framework (NHSOF)
• Clinical Commissioning Group Outcomes Indicator Set (CCGOIS)
• Public Health Outcomes Framework (PHOF)
• Adult Social Care Outcomes Framework (ASCOF)
Indicators in Context: What Can We Say?
The HSCIC’s website lists over 3,000 indicators, alongside other products, yet covers only a part of the full range of clinical care. There are many more indicators in use locally. This is illustrative of the challenges we face in monitoring clinical quality.
[Chart: Emergency admissions to hospital for acute conditions usually managed in primary care, all ages, England 2007/08. Indirectly age standardised rate per 100,000 with 95% confidence intervals, by ONS Area Group / Local Authority. Source: NHS IC Compendium, Crown Copyright]
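The chart’s measure, an indirectly age standardised rate with 95% confidence intervals, combines an observed count with an expected count derived from reference rates. A rough sketch with invented age bands, rates, populations and counts, using a simple normal approximation for the confidence interval rather than the exact method behind published indicators:

```python
import math

# Reference (e.g. England-wide) admission rates per person, by age band.
# All figures below are invented for illustration.
ref_rates = {"0-44": 0.004, "45-64": 0.008, "65+": 0.020}

# One local authority's population and its observed admissions
local_pop = {"0-44": 60000, "45-64": 25000, "65+": 15000}
observed = 720

# Expected admissions if the local population experienced the
# reference rates in every age band
expected = sum(ref_rates[band] * local_pop[band] for band in local_pop)

# Indirectly standardised ratio; SE approximated as sqrt(observed) / expected
ratio = observed / expected
se = math.sqrt(observed) / expected
ci = (ratio - 1.96 * se, ratio + 1.96 * se)

print(f"expected = {expected:.0f}, ratio = {ratio:.3f}, "
      f"95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

A ratio whose confidence interval excludes 1 suggests admissions genuinely above or below what the area’s age structure would predict.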
The Move to Monitoring Outcomes
• Accountability shift from what is done, to what is achieved with available resources, demonstrating continuing improvement
• In the absence of evidence based standards for some services, comparative data, for example stroke deaths, may show that outcomes are less than optimal
• Evidence-based process indicators, for example those listed in NICE Quality Standards and the Outcomes Frameworks act as a proxy for outcomes
• An intervention now may have an impact years / decades in the future; an outcome now may reflect interventions going back years / decades
• Attributing outcomes and apportioning credit, and hence accountability, is likely to be difficult
What is a Metric?
A metric is a measure of a known attribute
e.g. a speedometer in a car dashboard
e.g. within clinical care, a blood pressure reading
Metrics, whether based on physical instruments or questionnaires, need rigorous testing and calibration plus precision in use
What is an Indicator?
An indicator describes how a measure is expected to be used to judge quality:
• includes clear statements about the intended goal / objective;
• whether it is expected to be used in isolation or in combination with other measures or indicators;
• any thresholds or standards which are expected to be applied
e.g. a gauge to show whether speed is within legal limits in a car dashboard
e.g. within clinical care, the proportion of patients with controlled high blood pressure
An indicator may act as an alert to an issue that needs further investigation
Indicator or Metric?
• Metric – number of emergency readmissions to an acute hospital trust following an appendectomy
• Indicator – rate of readmissions
• Consider the context; may need to take into account:
– whether the readmissions are avoidable
– co-morbidities
– whether a certain number are acceptable
– casemix of patients
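The metric / indicator distinction can be sketched in a few lines. The spell records and field names here are invented for illustration, not a real data schema:

```python
# Each record is one hospital spell following an appendectomy
# (invented data; "avoidable" would in practice need clinical review)
spells = [
    {"readmitted": True,  "avoidable": True},
    {"readmitted": True,  "avoidable": False},
    {"readmitted": False, "avoidable": False},
    {"readmitted": False, "avoidable": False},
]

# Metric: a simple count of emergency readmissions
readmissions = sum(s["readmitted"] for s in spells)

# Indicator: a rate framed for judging quality -- here restricted to
# readmissions judged avoidable, over all spells
avoidable_rate = sum(
    s["readmitted"] and s["avoidable"] for s in spells
) / len(spells)

print(f"{readmissions} readmissions; "
      f"avoidable readmission rate = {avoidable_rate:.0%}")
```

The count alone says little about quality; the rate, defined with explicit numerator and denominator, is what can be compared and judged.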
Indicator Development
• Is the indicator rationale supported by evidence?
• Does the indicator relate to clinical care or outcome that is influenced by actions of commissioners or providers?
• Has this aspect been identified as a priority?
• Can the indicator be developed so that it is measurable?
• Is there evidence of inappropriate variation in clinical care or outcomes?
• Could adoption of best practice significantly improve quality and outcomes?
• Is there scope for improvement?
Indicator Development
• Do you want/need to look at a single aspect of care or whole pathway?
• How will improvement be measured?
• Who is your intended audience?
• If you are comparing with other trusts are you comparing like with like?
• Do you need a simple or composite indicator?
• Provider or commissioner based?
• Longitudinal or cross sectional?
• Selecting the number of indicators is not easy…
Deciding how many indicators to focus on
Single aspect, e.g. renal dialysis, versus whole pathway, e.g. obesity, uncontrolled high blood pressure, kidney disease, QOL, deaths
Tension – too few may leave gaps and distort priorities, too many may overwhelm the organisation
Potential solution - hierarchies, with ability to drill down to detail, as necessary
Potential solution – menu, with ability to select those to be displayed in the dashboard
[Diagram: Clinical Quality – pathway stages (risk; disease / ill health; adverse events; quality of life; premature death), potential activities (avoiding risk; reducing risk; timely intervention; late intervention) and indicators (NICE Quality Standards, e.g. “Information 5: Education and self-management”; NHS Outcomes Framework; CCG Outcomes Indicator Set)]
Establishing Limits and Thresholds
• In the absence of evidence-based standards, it is important to establish a basis for judging quality and improvement
• The ‘National Average’ is not always the best marker as it combines good and poor quality
• It may be possible to arrive at some notion of ‘optimum’ based on best levels achieved elsewhere, for example cancer survival or emergency admissions in some parts of the country / other countries
• Dependent on clarity around purpose of indicator and audience e.g. clinician, patient, policy maker, manager, public etc.
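One common way to judge an organisation against a benchmark rather than the raw national average is control limits, as used in funnel plots. A minimal sketch with invented figures:

```python
import math

# 3-sigma control limits around a benchmark proportion (funnel-plot
# style). All figures are invented for illustration.
benchmark = 0.05       # e.g. a best-quartile readmission proportion
n = 400                # the provider's number of spells
observed_prop = 0.09   # the provider's observed proportion

# Binomial standard deviation of a proportion around the benchmark
sigma = math.sqrt(benchmark * (1 - benchmark) / n)
upper = benchmark + 3 * sigma

# Exceeding the upper limit flags an outlier worth investigating --
# not a verdict on quality (check chance, data quality, casemix first)
flag = observed_prop > upper
print(f"upper limit = {upper:.3f}, investigate = {flag}")
```

Because the limits widen as the denominator shrinks, small providers are not flagged simply for having noisier rates.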
Indicator Assurance Process
• Hosted on behalf of the whole system
• Indicator Assurance Service
• Standard indicator assurance templates
• Methodology Review Group
• Independent Peer Review
• Indicator Assurance Process
• Indicator Governance Board
• National Library of Assured Indicators – repository
Indicator Assurance Considerations
• Purpose of indicator
• Rationale, evidence based standard
• What is measured – numerator, denominator, construction, source of data, completeness of counts, quality of data
• How data are aggregated - type of analysis (direct/indirect standardisation), risk adjustment e.g. for age, gender, method of admission, diagnosis, procedure, co-morbidity etc. to compare ‘like’ with ‘like’
• Scientific validity – face, content, construct, criterion, predictive; validity for public, clinicians, performance
• Interpretation – identifying outliers, explaining observations
• Use – timeliness, gaming, costs, access, credibility, feasibility, usefulness
• Investigation and action – play of chance, artefacts (e.g. data quality), quality of care
Indicator Development and Assurance
• Skills and expertise from HSCIC and the wider system:
– Methodologists
– Epidemiologists
– Statisticians
– Subject Matter Experts
– Informatics Specialists
– Measurement Specialists
– Clinicians and Patients
Dashboards
• “All that glitters is not gold” – Shakespeare, The Merchant of Venice
• “Simplify, simplify, simplify!” – Henry David Thoreau
• “Maximise the data-ink ratio” – Edward R Tufte, The Visual Display of Quantitative Information
• “Unless you know what you’re doing you’ll end up with a cluttered mess” – Stephen Few, Information Dashboard Design: The Effective Visual Communication of Data
Dashboards: 13 Common Mistakes
• Exceeding a single screen
• Supplying inadequate context
• Displaying excessive detail or precision
• Choosing deficient measures
• Choosing inappropriate visualisation
• Introducing meaningless variety
• Using poor design
• Encoding quantitative data inaccurately
• Arranging the data poorly
• Highlighting important data ineffectively
• Cluttering with useless decoration
• Misusing colour
• Unattractive display
Clinical Quality Dashboards: Maternity
Accident and Emergency Dashboard
The Rotherham NHS Foundation Trust Accident & Emergency Department Clinical Quality Indicators
Overview
95% of patients who needed admission to hospital waited under 389 minutes from arrival to departure.

In March, 95% of patients waited under 43 mins for an initial assessment; however, the median time to assessment was 9 mins. 6,793 patients attended the A&E department in March, which is 593 more than expected based on an average attendance of 200 per day. On average patients waited 74 mins to see a Doctor or Nurse Practitioner; 95% of patients waited less than 4 hrs from arrival to departure, and of those admitted to hospital, 95% waited less than 6 hrs 29 mins. There are several factors that influence the amount of time a patient has to wait before being admitted to a bed on a ward, one of these being the availability of a bed. This does not mean that we do not have enough beds to care for our patients; it may be that the hospital is busier than normal, or that providing a safe and secure discharge for patients may take additional time. Of those patients not admitted to hospital, 95% waited less than 3 hrs 49 mins. Rotherham Hospital is committed to improving the service provided to the people of Rotherham and always welcomes your views and ideas.
Summary of A/E Performance - March 2012
This dashboard presents a comprehensive and balanced view of the care delivered by our A&E department, and reflects the experience and safety of our patients and the effectiveness of the care they receive. These indicators will support patient expectations of high quality A&E services and allow our department to demonstrate our ambition to deliver consistently excellent services which are continuously improving.
4.58% of attendances this month left the department before being seen. 4% of attendances this month were unplanned re-attendances.
Data will be available from an audit facilitated by the Royal College of Emergency Medicine. The date of the audit has yet to be confirmed.
44.82% of patients with cellulitis and 82.35% of patients with deep vein thrombosis attending A&E are admitted to hospital.
95% of patients waited under 43 minutes from arrival to full initial assessment which includes all vital signs for patients arriving by ambulance
On average, patients waited 74 minutes from arrival to see a Doctor or Nurse Practitioner.

95% of patients waited under 240 minutes from arrival to departure.

95% of patients not requiring admission to hospital waited under 229 minutes from arrival to departure.
[Pathway diagram: patient arrival → initial assessment (<15 mins) → treatment (<60 mins) → total time in A&E (<240 mins); left without being seen (<5%); re-attendance (<5%)]
Legend
Successfully meets performance threshold
Does not meet threshold
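The dashboard figures above (“95% of patients waited under … minutes”) are 95th-percentile waits. A minimal sketch of deriving such figures from raw wait times, using the simple nearest-rank method and invented data:

```python
import math

# Invented wait times in minutes, unsorted as they might arrive
waits = [38, 5, 44, 12, 25, 9, 60, 18, 40, 31]

def percentile(values, p):
    """Nearest-rank percentile: the value at rank ceil(p/100 * N)."""
    ordered = sorted(values)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

p95 = percentile(waits, 95)  # "95% of patients waited under ..."
p50 = percentile(waits, 50)  # the median wait
print(f"95th percentile = {p95} mins, median = {p50} mins")
```

Reporting both the 95th percentile and the median, as the Rotherham summary does, shows the typical experience alongside the longest waits rather than letting either figure stand alone.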
Feedback from patients, carers and staff relating to experience is important to improve the service. Monthly surveys are undertaken in the department and the results are shared with the public. The results for March show that 80% of patients said that they were treated with privacy and dignity when being examined or treated, and 88% of patients said that they had their health problems explained to them in a way that they understood.
Consultant Sign-off
Ambulatory Care
Service Experience
In Conclusion
• There are a lot of indicators out there
• Ultimate choice depends on whether they meet criteria for good indicators
• National indicators for NHSOF and CCGOIS – assured and tested
• Local indicator development based on local priorities
• Consider triggers and alerts
• Uses for Board reporting and assurance
• Dashboards can be used to support delivery of safe and effective care – but only if they are well designed
• Integrating local data flows – instantaneous reporting