Understanding our graphs
The Bureau of Health Information uses a range of graphs to present key research findings in a simple and visually engaging way.
Below are some of the most common graphs BHI uses. Advice on how to read and interpret each of these can be found on the following pages.
• Lozenge graph
• Cumulative mortality graph
• String of pearls graph
• Circuit graph
• Dot plot
• Mountain graph
• Double axis line graph
• Battenberg graph
• Trend graph
• Scatter plot
• Zig-zag graph
• Funnel plot
Source: The Insights Series – Exploring clinical variation in readmission, NSW, July 2012 – June 2015
Detailed information about when in the 30-day period following discharge readmissions occur, and the reasons for those readmissions, can highlight potential areas for improvement.25 A high number of readmissions within seven days of discharge may, for example, point to problems with discharge planning.
About 41% of returns to acute care occurred in the seven days following discharge and about half of these were for AMI or related conditions (Figure 17). Among the returns that were potentially related to hospital care, pneumonia was the most frequent cause (Table 3).
Some studies have found a relationship between length of stay and the likelihood of readmission.26,27 Lengths of stay that are too short may result in patients being discharged before their recovery is properly established and their condition stabilised, leading to an unplanned return to acute care.
Conversely, lengths of stay that are too long carry an increased risk of hospital-acquired complications such as infections. The unadjusted readmission rate following hospitalisation for AMI increased with increasing length of stay in the index hospitalisation, up to 22% following stays of 15+ days (Figure 18).
Categorising the reasons for readmission that occurred after short (1–2 days), medium (3–7 days) or long (8+ days) hospitalisations reveals that as length of stay increased, there was a greater proportion of returns that were potentially related to hospital care (Figure 19).
Acute myocardial infarction: exploring patterns of readmission
Figure 17 Acute myocardial infarction, number of, and reasons for readmission, day 1–30 post discharge, NSW public hospitals, July 2012 – June 2015
[Figure 17: number of returns to acute care (0–350), by days post discharge (1–30), stacked by reason for readmission: 1. Same principal diagnosis; 2. Condition related to principal diagnosis; 3. Potentially related to hospital care (relevant at any time); 4. Potentially related to hospital care (time sensitive, ≤7 days post discharge); 5. Potentially related to hospital care (time sensitive, 8–30 days post discharge); 6. Other conditions.]
Table 3 Readmission, top three reasons in categories potentially related to hospital care

Category 3. Potentially related to hospital care (relevant at any time): gastrointestinal haemorrhage (40); acute kidney failure (33); pulmonary embolism without mention of cor pulmonale (32)

Category 4. Potentially related to hospital care (time sensitive, ≤7 days post discharge): pneumonia (48); COPD with acute lower respiratory infection (26); syncope and collapse (21); urinary tract infection (21)
[Battenberg graph excerpt: for each NSW hospital (listed alphabetically, Armidale and New England Hospital through Macksville District Hospital), RSMR and RSRR results for 2009–2012 and 2012–2015 across nine conditions: acute myocardial infarction, ischaemic stroke, haemorrhagic stroke, congestive heart failure, pneumonia, chronic obstructive pulmonary disease, hip fracture surgery, total hip replacement and total knee replacement. Legend, compared to NSW: higher than expected; no different than expected; lower than expected; <50 cases; not applicable.]
UNDERSTANDING OUR GRAPHS

Trend graph

Trend graphs show a general pattern of results over time.

This example shows changes in the number of people leaving the emergency department (ED), and the ways in which they leave. It compares results from the end of 2010 through to the end of 2015. The different coloured lines show the different modes of separation, i.e. how a patient left the ED.

What is this graph telling me?

• The most common way for a patient to leave the ED is to be treated and discharged home, followed by being treated and admitted to a hospital ward
• Compared with the last three months of 2010, results for the same period in 2015 show an increase in the number of patients who were treated and discharged, treated and admitted to hospital, and transferred to another hospital
• There was a decrease in the number of patients who left without, or before completing, treatment over the five years.
[Figure: number of presentations (0–500,000) by quarter, October–December 2010 to October–December 2015, by mode of separation. October–December values: treated and discharged 332,169 rising to 424,314; treated and admitted to hospital 141,226 rising to 180,228; patient left without, or before completing, treatment 45,438 falling to 35,709; transferred to another hospital 9,525 rising to 13,620.]
Source: Hospital Quarterly, Activity and performance in NSW public hospitals, October to December
Scatter graph
[Figure: scatter plot of patients leaving the ED within four hours (%, y-axis, 0–100, marked higher/lower than NSW) against change compared with the same quarter last year (percentage points, x-axis, −20 to 20, decrease to increase). Each point is a hospital, shaped by peer group (A1, A2, A3, B, C1, C2); the NSW result is shown as a blue line. Labelled hospitals include Casino, Auburn, Manning, Lismore, Campbelltown, Wollongong, The Children's Hospital at Westmead, Orange, Bathurst, Macksville, Canterbury, Mount Druitt, Gosford, Broken Hill, Maitland, Bowral, Blacktown and Westmead.]
BHI uses scatter plots to show an overall pattern of performance, as well as individual hospital results, and to compare results over time.
In this example, each different shape represents a different sized hospital (i.e. peer group) and the blue line represents the overall result for NSW. Hospitals are positioned on the scatter plot according to the proportion of their patients who left the emergency department (ED) within four hours (the vertical or y-axis) and the change for this measure since the same period in the previous year (the horizontal or x-axis).
What is this graph telling me?

• Hospitals to the left of the blue vertical line had a lower proportion of patients leaving the ED within four hours compared with the same period in the previous year
• Hospitals in the top right section had the best performance overall: they achieved better results than what is typically seen across NSW for the percentage of patients leaving the ED within four hours, and they improved their result compared with the same time the previous year
• Hospitals in the lower left section had worse performance in this quarter: these hospitals had a decrease in the percentage of patients leaving the ED within four hours, and their performance was below the NSW overall result.
Source: Hospital Quarterly, Activity and performance in NSW public hospitals, October to December
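Reading the quadrants amounts to two comparisons per hospital: position relative to the NSW line, and direction of change. A minimal sketch with invented hospital figures (the NSW result and both hospitals are hypothetical):

```python
# Sketch: classify hospitals into scatter-plot quadrants.
# The NSW overall result and the hospital figures are invented.

NSW_RESULT = 74.0  # % of patients leaving the ED within four hours

def quadrant(pct_within_4h, change_pts):
    """Quadrant of the scatter plot for one hospital."""
    vertical = "above NSW" if pct_within_4h > NSW_RESULT else "below NSW"
    horizontal = "improved" if change_pts > 0 else "declined"
    return f"{vertical}, {horizontal}"

hospitals = {"Hospital A": (88.0, 5.0),   # top right: best performance
             "Hospital B": (60.0, -8.0)}  # lower left: worse performance
for name, (pct, change) in hospitals.items():
    print(name, "->", quadrant(pct, change))
```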
Dot plot
[Figure: dot plots of hospital results (% of patients, 0–100) relative to the NSW result (blue line) for three survey questions: ED staff 'definitely' told patient when to resume usual activities; staff 'completely' took family or home situation into account when planning discharge; received understandable explanations from health professionals 'all of the time'. NSW results: 49%, 58% and 56%. Legend: significantly higher than NSW; no significant difference; significantly lower than NSW.]
BHI uses dot plots to show the range of results across different hospitals or local health districts and how many achieved the same result.
This example shows patient opinion across NSW hospitals about the responsiveness of staff to their needs while they were in the emergency department. Each circle represents a different hospital and the blue line represents the overall result for NSW.
Green circles represent a result that is statistically more positive than the NSW result; red circles show statistically lower results than the NSW result. The term ‘statistically’ means the difference was not likely to be just by chance.
Circles stacked on top of each other show the number of hospitals that achieved the same result.
What is this graph telling me?
• For five hospitals, 60% of patients said staff ‘completely’ took their family or home situation into account when planning discharge
• Across NSW, the results ranged from just under 40% to just under 90%. Both of these results were significantly different from NSW.
Source: Snapshot Report: Patient Perspectives: Exploring aspects of integration for hospital patients, Volume 2, May 2015
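The idea that a hospital's result is 'statistically' higher or lower than the NSW result can be illustrated with a one-sample proportion z-test. This is an illustrative choice with invented hospital numbers; BHI's exact significance method is not described here:

```python
import math

def proportion_z(hospital_positive, hospital_n, nsw_rate):
    """z-statistic for a hospital's proportion against the NSW rate.

    |z| > 1.96 is significant at the conventional 5% level, i.e. the
    difference is unlikely to be due to chance alone.
    """
    p_hat = hospital_positive / hospital_n
    se = math.sqrt(nsw_rate * (1 - nsw_rate) / hospital_n)
    return (p_hat - nsw_rate) / se

# Hypothetical hospital: 150 of 200 patients answered 'completely',
# against a NSW result of 58%.
z = proportion_z(150, 200, 0.58)
print(round(z, 2), abs(z) > 1.96)
```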
UNDERSTANDING OUR GRAPHS
Z-shaped graphs
[Figure: two Z-shaped graphs linking the NSW population to hospital use. Admissions: of the 7.5 million NSW population, 6.7 million (90%) were not hospitalised; 2% of people accounted for 22% of the 1.2 million admissions, 7% of people for 45% of admissions, and 1% of people for 33% of admissions. ED visits: of the 7.6 million NSW population, 6.1 million (80%) did not visit an ED; 13% of people accounted for 41% of visits, 4% of people for 24% of visits, and 3% of people for 35% of visits.]
BHI uses this style of graph to show how many and how often patients visit an emergency department (ED), hospital or other healthcare facility.
In this example, the first bar shows the population of NSW for the year 2014–15, broken down into four categories: those who never visited an ED during that year, and those who visited an ED once, twice, or three or more times.

For the people who did visit an ED during that year, the second bar shows the percentage of all ED visits that each category accounted for.

What is this graph telling me?

• The darkest blue colour shows that, during the specified year, 200,015 people visited an emergency department three or more times and accounted for 854,326 visits (about a third of all emergency department visits in NSW)
• Therefore, a small percentage of the population accounted for a high level of ED use.
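The 'small share of people, large share of visits' arithmetic behind the bars can be sketched directly. The counts below are round, illustrative numbers loosely based on the figure, and the average of four visits per person in the '3+' category is an assumption:

```python
# Sketch: share of the population vs share of ED visits per category.
# Counts are illustrative, not the published NSW figures.
population = 7_600_000
people = {"0": 6_100_000, "1": 1_000_000, "2": 300_000, "3+": 200_000}
avg_visits = {"0": 0, "1": 1, "2": 2, "3+": 4}  # assumed averages

total_visits = sum(people[c] * avg_visits[c] for c in people)
for cat in people:
    people_share = 100.0 * people[cat] / population
    visit_share = 100.0 * people[cat] * avg_visits[cat] / total_visits
    print(f"{cat} ED visits: {people_share:.0f}% of people, "
          f"{visit_share:.0f}% of visits")
```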
Source: Annual Performance Report: Healthcare in Focus 2016
Circuit graph
[Figure: healthcare expenditure as a percentage of GDP/GSP (0–18), 2003 and 2013, for the United Kingdom, Norway (2011), Australia, NSW, New Zealand (2012), Canada, France, Germany, Sweden, Netherlands, Switzerland and the United States, with 2013 values ranging from 8.5 to 17.1 and the percentage point difference between the two years shown for each.]
Circuit graphs are used to compare the gap between two results.
This example shows comparative spending on healthcare as a percentage of Gross Domestic (State) Product between NSW and comparator countries.
What is this graph telling me?
• The proportion of NSW’s wealth dedicated to healthcare spending increased between 2003 and 2013 from 8.1% to 9.4% of gross state product.
• This is similar to other countries like the United Kingdom and Canada, as shown by the line between the un-shaded and shaded circles.
Source: Annual Performance Report: Healthcare in Focus 2016
Lozenge graph

A lozenge graph is used to compare a unit's results over time.

This example shows the hospitals with a changed or consistent outlier status over the two most recent reporting periods (2009–12 and 2012–15), for 30-day mortality across seven different conditions. The rows show hospitals with consistently higher or lower than expected mortality, as well as hospitals that improved or deteriorated between the two time periods.

What is this graph telling me?

There were eight hospitals with higher than expected mortality for the same condition across both time periods (top row). One of these (Tamworth) did so for three conditions.

For 18 hospitals, mortality improved to 'no different than expected' for at least one condition; for Tamworth and Port Macquarie, the improvement was for three and two conditions, respectively.

By showing the hospitals with a changed outlier status over two consecutive time periods, hospital performance can be compared over time.

Hospitals with changed status, 30-day mortality, NSW, 2009–12 and 2012–15

Source: The Insights Series – Exploring clinical variation in mortality, NSW, July 2012 – June 2015
Figure 10 Hospitals with changed outlier status, 30-day mortality, NSW, 2009–12 and 2012–15
[Figure 10: for each of seven conditions (AMI, ischaemic stroke, haemorrhagic stroke, CHF, pneumonia, COPD, hip fracture surgery), lozenges pairing each hospital's 2009–12 (09–12) and 2012–15 (12–15) outlier status: consistently higher than expected, consistently lower than expected, improved, or deteriorated. Legend: higher than expected; no different than expected; lower than expected.]
Changes between 2009–12 and 2012–15
Three hospitals had consistently low mortality for the same condition in both 2009–12 and 2012–15 and two of them (Prince of Wales and St Vincent’s) did so for two conditions. There were eight hospitals with higher than expected mortality for the same condition across both time periods, and one of these (Tamworth) did so for three conditions (Figure 10).
For 10 hospitals, mortality improved to lower than expected – and for four of them, improvement was for two conditions (Blacktown, Concord, Royal North Shore and Wollongong). One hospital changed from higher to lower than expected mortality – Blacktown for pneumonia.
For 18 hospitals, mortality improved to no different than expected for at least one condition; and for Tamworth and Port Macquarie, the improvement was for three and two conditions, respectively (Figure 10).
Note: Using 90% control limits in 2009–12, eight hospitals had higher than expected mortality: Belmont (COPD), Blacktown (COPD), Bowral (AMI), Coffs Harbour (hip fracture surgery and ischaemic stroke), Royal Prince Alfred (ischaemic stroke), St George (AMI), Westmead (ischaemic stroke). One hospital had lower than expected mortality: Belmont (ischaemic stroke).
* <50 hospitalisations in 2009–12
Cumulative mortality graph

Cumulative graphs provide information about an indicator over a period of time.

In this example, the cumulative graph shows how patient deaths are distributed over the 30-day period following hospitalisation: the percentage of deaths that had occurred by each day. Results are shown for two example hospitals, Queanbeyan (green line) and Parkes (red line), as well as for NSW overall (blue line). Each death results in a step increase in the cumulative mortality line.

What is this graph telling me?

Compared with the NSW cumulative mortality profile, mortality among patients hospitalised at Parkes increased more sharply around day 10, while for patients at Queanbeyan, fewer deaths over the 30-day period are reflected in a much flatter curve.

Comparing individual hospital results with NSW puts their performance in context, showing where NSW stands and where to look to guide improvement.

Cumulative mortality, by day, acute myocardial infarction, NSW and highest and lowest RSMR hospitals, July 2012 – June 2015

Source: The Insights Series – Exploring clinical variation in mortality, NSW, July 2012 – June 2015
[Figure: cumulative mortality (%, 0–50) by days since hospitalisation (0–30) for Queanbeyan Health Service (low mortality), NSW, and Parkes District Hospital (high mortality).]
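A cumulative mortality curve of this kind can be built directly from day-of-death data. A minimal sketch using an invented cohort (the cohort size and death days are made up for illustration):

```python
def cumulative_mortality(death_days, n_patients, window=30):
    """Cumulative mortality (%) for each day 0..window since hospitalisation.

    death_days: the day on which each death occurred. Each death adds a
    step of 1/n_patients to the curve, which is why the line rises in steps.
    """
    curve = []
    for day in range(window + 1):
        deaths_so_far = sum(1 for d in death_days if d <= day)
        curve.append(100.0 * deaths_so_far / n_patients)
    return curve

# Invented cohort: 200 patients, 14 deaths within 30 days of hospitalisation
death_days = [0, 1, 1, 2, 3, 5, 8, 9, 10, 10, 11, 15, 22, 28]
curve = cumulative_mortality(death_days, 200)
print(f"30-day cumulative mortality: {curve[30]:.1f}%")  # 14/200 = 7.0%
```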
String of pearls graph

A ‘string of pearls’ graph is used to show the distribution of unit (often hospital or local health district) results and highlight differences from the NSW result.

This example shows a series of string of pearls graphs of individual hospitals’ risk-standardised mortality ratios (RSMRs), by peer group. Each circle shows a hospital’s RSMR and highlights whether it is higher than expected, no different than expected or lower than expected, compared with the NSW result (shown as a blue line).

Note: String of pearls graphs can be shown vertically or horizontally.

What is this graph telling me?

Across peer groups, higher and lower than expected mortality occurred in principal referral and district hospitals. Among smaller district hospitals (peer group C2), there was one hospital with higher than expected mortality (coloured red) and two hospitals with lower than expected mortality (coloured green).

Hospital peer groups are used to cluster similar hospitals together so that fair comparisons can be made.

Acute myocardial infarction 30-day risk-standardised mortality ratio, by peer group, July 2012 – June 2015

Source: The Insights Series – Exploring clinical variation in mortality, NSW, July 2012 – June 2015
Figure 15 Acute myocardial infarction 30-day risk-standardised mortality ratio (hospitals with ≥ 50 patients), by peer group, July 2012 – June 2015
Figure 16 Acute myocardial infarction, 15-year time series results for hospitals that were outliers for the period July 2012 – June 2015
* RSMR outliers in July 2012 – June 2015 used control limits of 95% and 99.8%. Periods between July 2000 and June 2012 used control limits of 90% and 95%. Historical results that were outside the 90% control limits but did not reach significance at the 95% level are categorised as ‘intermediate’ results.
Legend: higher than expected; lower than expected; no different than expected; <50 cases (not reported); statistically significant result; intermediate* result.

Hospitals with lower than expected mortality in July 2012 – June 2015: Kempsey, Prince of Wales, Queanbeyan. Hospitals with higher than expected mortality in July 2012 – June 2015: Calvary Mater, Parkes, Dubbo, Nepean.
[Figures 15 and 16: string of pearls graphs of RSMR (observed/expected, 0.0–2.5) with the NSW result shown as a line, by peer group (principal referral, peer group A; major, peer group B; district group 1, peer group C1; district group 2, peer group C2), with the number of patients in each group, and 15-year time series results across the three-year periods 00–03, 03–06, 06–09, 09–12 and 12–15 for outlier hospitals.]
Funnel plot

Funnel plots are used to help interpret whether differences in unit (often hospital) results are significant, taking into account the number of patients seen in the hospital.

Mortality is influenced by a wide range of factors, meaning there will always be some level of variation in patient outcomes. The ‘funnel’ shape indicates the tolerance around this variability.

Hospitals with fewer hospitalisations (and so a relatively low expected number of deaths, appearing towards the left-hand side of the plot) will display greater variability and may have a high or low ratio by chance. Fair assessment of performance should take this into account.

Hospitals above the upper 95% control limit of the funnel are considered to have higher than expected mortality ratios; those below the lower 95% limit are considered to have lower than expected RSMRs. For hospitals outside the 99.8% limits, there is greater confidence about their outlier status.

What is this graph telling me?

This funnel plot shows 30-day RSMRs for each hospital in NSW. Of the 67 hospitals that admitted 50 or more AMI patients in the three-year period, three (Queanbeyan, Kempsey and Prince of Wales) had lower than expected mortality and five (including Parkes, Dubbo, Calvary Mater and Nepean) had higher than expected mortality.

Patient outcomes always vary across hospitals. Risk-standardisation helps make fair comparisons by taking into account patient characteristics that may influence outcomes regardless of the care provided (e.g. age and co-existing diseases (comorbidities)).

Acute myocardial infarction 30-day risk-standardised mortality ratio, NSW public hospitals, July 2012 – June 2015

Source: The Insights Series – Exploring clinical variation in mortality, NSW, July 2012 – June 2015
Figure 12 Acute myocardial infarction 30-day risk-standardised mortality ratio, NSW public hospitals, July 2012 – June 2015
Figure 13 The effect of statistical adjustment on measures of acute myocardial infarction 30-day mortality, NSW public hospitals (peer groups A – C), July 2012 – June 2015
Range of hospital results, with increasing account taken of patient characteristics: unadjusted ratios ranged from 0.13 to 1.91; age- and sex-standardised ratios ranged from 0.13 to 1.95; risk-standardised mortality ratios ranged from 0.15 to 2.20.
Unadjusted Age and sex standardised RSMR
Lower than expected Higher than expected
Outlier for all models Queanbeyan Calvary Mater
Outlier for age- and sex-standardised and risk-standardised Kempsey Nepean
Outlier for unadjusted and risk-standardised - -
Outlier for risk-standardised only Prince of Wales Dubbo, Parkes
[Figure 12: funnel plot of RSMR (observed/expected, 0.0–4.0) against the expected number of deaths within 30 days, with 95% and 99.8% control limits. Queanbeyan, Kempsey and Prince of Wales sit below the limits (lower than expected); Parkes, Dubbo, Calvary Mater and Nepean sit above (higher than expected).]
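The control limits that give the funnel its shape can be sketched numerically. This is a minimal illustration assuming a normal approximation to a Poisson model for deaths; BHI's published methods define the exact limits, which may differ:

```python
import math

def funnel_limits(expected_deaths, z):
    """Approximate control limits for an observed/expected ratio.

    Under a Poisson model for deaths, the standard deviation of O/E
    around 1 is roughly 1/sqrt(E), so the limits narrow as the
    expected count grows, producing the funnel shape.
    """
    half_width = z / math.sqrt(expected_deaths)
    return (max(0.0, 1.0 - half_width), 1.0 + half_width)

# z is about 1.96 for 95% limits and about 3.09 for 99.8% limits
for e in (10, 50, 200):
    lo, hi = funnel_limits(e, 1.96)
    print(f"E = {e:3d}: 95% limits {lo:.2f} to {hi:.2f}")
```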
Mountain graph

Mountain graphs show how the volume and type of an outcome change over time.

This example shows, for AMI patients, the number of readmissions (or returns to acute care) by the number of days since patients were discharged from hospital. The readmissions are separated (stratified) into groups according to the main reason for readmission. These categories show whether the readmission was:

• for the same condition as, or a condition related to, the principal diagnosis (categories 1 and 2)
• for a complication or other issue related to hospital care (categories 3, 4 or 5)
• for an unrelated reason (category 6).

What is this graph telling me?

Readmissions occurred more frequently soon after discharge. On the third day after discharge following AMI hospitalisation there were about 300 readmissions; about 60 of these were for the same principal diagnosis (i.e. AMI) and about 115 were for a condition related to the principal diagnosis.

Understanding reasons for readmission can tell us about the role hospital care plays in potentially avoidable readmissions.

Acute myocardial infarction: number of, and reasons for readmission, day 1–30 post discharge, NSW public hospitals, July 2012 – June 2015

Source: The Insights Series – Exploring clinical variation in readmission, NSW, July 2012 – June 2015
[Figure: number of returns to acute care (0–350), by days post discharge (1–30), stacked by reason for readmission: 1. Same principal diagnosis; 2. Condition related to principal diagnosis; 3. Potentially related to hospital care (relevant at any time); 4. Potentially related to hospital care (time sensitive, ≤7 days post discharge); 5. Potentially related to hospital care (time sensitive, 8–30 days post discharge); 6. Other conditions.]
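The stacked counts behind a mountain graph are a simple day-by-category tally. A sketch with invented readmission records (the records below are hypothetical, not BHI data):

```python
from collections import Counter

# Invented readmission records: (days post discharge, reason category 1-6)
readmissions = [
    (2, 1), (3, 2), (3, 2), (3, 1), (5, 4), (9, 6), (12, 5), (25, 6),
]

by_day = Counter(day for day, _ in readmissions)   # total height per day
by_day_and_reason = Counter(readmissions)          # height of each layer

print(f"Returns on day 3: {by_day[3]}")
print(f"Day-3 returns related to the principal diagnosis: "
      f"{by_day_and_reason[(3, 2)]}")
```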
UNDERSTANDING OUR GRAPHS
Double axis line graphThis graph explores for a particular hospital, whether changes in its risk-standardised mortality ratio (RSMR) over time is a result of changes in case mix or changes in observed mortality.
This example shows a hospital’s results for three different measures for mortality, using two axes, by five three year time periods.
Using the left axis, the graph shows:
1. How many patients died in or out of hospital within 30 days of admission (the observed rate, shown as a grey line) and
2. How many deaths were expected within 30 days of admission, given the characteristics of the hospital patients (the expected rate, shown as a dashed line).
Using the right axis, the graph shows the RSMR (shown as a coloured line).
RSMRs that are significantly different from the NSW expected mortality, for that period, are highlighted by a coloured square. Each line shows results over time for each of these three measures.
Source: The Insights Series – Exploring clinical variation in mortality, Goulburn Base Hospital Performance Profile, July 2012 – June 2015
What is this graph telling me?
In this example, the hospital’s case mix has not changed substantially over the 15-year period – the dashed line showing expected mortality has trended down only gradually. The actual (observed) mortality rate, however, has changed markedly, and the RSMR closely follows those changes in actual mortality.
For patients hospitalised with a principal diagnosis of pneumonia, the hospital had lower than expected mortality (green square) in July 00 – June 03 and higher than expected mortality (red square) in July 06 – June 09.
A statistically significant difference is one that we can be confident is a real difference, and is not only due to chance.
Pneumonia, this hospital’s risk-standardised mortality ratio, expected mortality rates and observed (unadjusted) mortality rates, July 2000 – June 2015
[Graph: left axis shows observed (unadjusted) and expected mortality rates, 0–25; right axis shows the risk-standardised mortality ratio (RSMR), 0.0–2.0. Results are plotted for five three-year periods: July 00 – June 03, July 03 – June 06, July 06 – June 09, July 09 – June 12 and July 12 – June 15. Statistically significant results are marked with coloured squares.]
Battenberg graphs provide an overview of hospital level results across mortality and readmission analyses for two time periods.
In this example, hospital level results are shown across nine conditions. Each Battenberg shows the significance of two measures: across, the risk-standardised mortality ratio (RSMR) and risk-standardised readmission ratio (RSRR); down, how these measures change across two consecutive three-year time periods (2009–12 and 2012–15).
Each square represents either an RSMR or RSRR. The colour of each cell shows the significance of that hospital’s result compared to the NSW result.
Battenberg graph
Source: The Insights Series – Exploring clinical variation in mortality and readmission, An overview, July 2012 – June 2015
What is this graph telling me?
The overwhelming majority of hospital results were no different than expected (coloured grey), given their patient case mix.
For the mortality analyses*, there were 398 individual hospital results, and of those 45 were higher than expected (coloured red) and 20 were lower than expected (coloured green).
For the readmission analyses*, there were 435 individual hospital results, and of those 31 were higher than expected and 27 were lower than expected.
* Not all hospitals are shown in this example. Please refer to the original graph for details.
By showing different types of measures (here, RSMR and RSRR) across two consecutive time periods, hospital performance can be compared over time.
Overview of hospital results, UCV – An overview report
[Graph: legend – compared to NSW, each cell is coloured higher than expected, no different than expected, or lower than expected; cells may also be marked ‘<50 cases’ or ‘not applicable’. Columns show RSMR and RSRR results for nine conditions: acute myocardial infarction, ischaemic stroke, haemorrhagic stroke, congestive heart failure, pneumonia, chronic obstructive pulmonary disease, hip fracture surgery, total hip replacement and total knee replacement. Rows list individual NSW hospitals (Armidale and New England Hospital through Macksville District Hospital in this excerpt), each with results for 2009–2012 and 2012–2015.]
Trend graphs show changes in a measure over time.
This example looks at the number of patients who visited an emergency department (ED) in three-month periods (quarters).
The results are split (stratified) into four groups, by mode of separation. The mode of separation is a way of describing where patients went after they left an ED.
The graph shows how many patients:
• Were treated in ED and discharged home
• Were treated in the ED and then admitted to hospital
• Left the ED without receiving treatment or before completing treatment
• Were transferred to another hospital.
Each coloured line shows changes over time in the number of patients in each of these four groups.
Trend graph
[Graph: y-axis shows number of presentations, 0–500,000; x-axis shows quarters from October–December 2010 to October–December 2015. Four lines: treated and discharged (332,169 in October–December 2010, rising to 424,314 in October–December 2015); treated and admitted to hospital (141,226 rising to 180,228); patient left without, or before completing, treatment (45,438 falling to 35,709); transferred to another hospital (9,525 rising to 13,620).]
Source: Hospital Quarterly, Activity and performance in NSW public hospitals, October to December 2015
What is this graph telling me?
Most ED patients were treated and discharged.
In the October-December quarter of 2010, there were 332,169 visits for which patients were treated and discharged. This number increased to 424,314 in the same quarter in 2015.
Because patterns of patient visits to an ED are affected by seasons (for example, there are more visits in the winter) and holiday periods, it makes sense to compare the same ‘quarter’ each year.
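That same-quarter comparison can be sketched in code, assuming the values labelled on the graph are the October–December ‘treated and discharged’ counts for each year:

```python
# Sketch: comparing the same quarter in consecutive years, which sidesteps
# seasonal effects. Counts are the October-December 'treated and discharged'
# values labelled on the graph (assumed to be the Oct-Dec quarters).

oct_dec_discharged = {2010: 332_169, 2011: 344_116, 2012: 358_129,
                      2013: 391_860, 2014: 419_013, 2015: 424_314}

def same_quarter_change(counts: dict, year: int) -> int:
    """Change in presentations compared with the same quarter a year earlier."""
    return counts[year] - counts[year - 1]

print(same_quarter_change(oct_dec_discharged, 2015))  # 5,301 more presentations
```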
Patients who left the emergency department, by mode of separation, October 2010 to December 2015
Scatter plot
[Graph: y-axis shows patients leaving the ED within four hours (%), 0–100, with results above the NSW line higher than NSW and those below it lower than NSW; x-axis shows change compared to same quarter last year, -20 to +20 percentage points (decrease to increase). Shapes denote peer groups A1, A2, A3, B, C1 and C2, plus NSW. Labelled hospitals include Casino, Auburn, Manning, Lismore, Campbelltown, Wollongong, The Children’s Hospital at Westmead, Orange, Bathurst, Macksville, Canterbury, Mount Druitt, Gosford, Broken Hill, Maitland, Bowral, Blacktown and Westmead.]
Scatter plots show results for units (often hospitals) on two different measures.
In this example, hospital results are shown for the percentage of patients who left the ED in four hours or less; and for the extent of change in that percentage since the same quarter, in the previous year.
Each shape represents a hospital, and different shapes are used to denote peer groups. The dark blue horizontal line represents the overall NSW result for the quarter and the pale blue vertical line represents no change since the same quarter last year.
For hospitals shown above the blue NSW line, a higher percentage of patients spent four hours or less in the ED, compared with the overall NSW result. For hospitals below this line, a lower percentage of patients spent four hours or less in the ED, compared with the overall NSW result. Hospitals shown to the left of the vertical ‘0’ line had lower results, compared with the same quarter last year, while those shown to the right of the vertical line had higher results.
Source: Hospital Quarterly, Activity and performance in NSW public hospitals, October to December 2015
What is this graph telling me?
Hospitals shown in the top right quadrant (or quarter) of the graph had a relatively high percentage of patients who spent four hours or less in the ED and improved on their own result from the same quarter in the previous year.
Hospitals shown in the bottom left quadrant had a relatively low percentage of patients who spent four hours or less in the ED and performed worse than in the previous year.
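Reading a hospital’s quadrant can be sketched as follows (the NSW percentage here is a hypothetical value, not a published result):

```python
# Sketch: describing a hospital's position on the scatter plot.
# NSW_RESULT is a hypothetical overall NSW percentage, not a published figure.

NSW_RESULT = 81.0

def quadrant(pct_within_four_hours: float, change_pts: float,
             nsw: float = NSW_RESULT) -> str:
    """Describe a hospital's position relative to the two reference lines."""
    vertical = ("higher than NSW" if pct_within_four_hours > nsw
                else "lower than NSW")
    horizontal = "increase" if change_pts > 0 else "decrease"
    return f"{vertical}, {horizontal} on same quarter last year"

# Top right quadrant: above the NSW line and improving on last year.
print(quadrant(90.0, 5.0))
```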
Percentage of patients who spent four hours or less in the emergency department, and percentage point change since same quarter last year, hospitals by peer group, October – December 2015
Hospital peer groups are used to cluster similar hospitals together so that fair comparisons can be made.
Dot plot
[Graph: legend – hospital result, relative to NSW: significantly higher than NSW, no significant difference, significantly lower than NSW; x-axis shows % of patients, 0–100. Three survey items with NSW results: ED staff ‘definitely’ told patient when to resume usual activities (49%); staff ‘completely’ took family or home situation into account when planning discharge (58%); received understandable explanations from health professionals ‘all of the time’ (56%).]
Dot plots show the distribution of results for units (often hospitals) and highlight differences from the NSW result.
This example shows a series of dot plots for responses to three patient survey questions, by hospital.
Each plot shows the number of hospitals, by the percentage of their patients who gave the response shown in inverted commas (usually the most positive response option, or ‘top category’).
Each circle shows a hospital’s result and highlights whether it is significantly different from the NSW result.
Source: Patient Perspectives: Exploring aspects of integration for hospital patients, Volume 2, May 2015
What is this graph telling me?
When patients were asked whether staff took their family or home situation into account when planning their discharge, 58% of NSW patients said yes, ‘completely’. This percentage ranged across hospitals from 39% to 88%.
There were four hospitals with results significantly lower than NSW (coloured red); and four hospitals with results significantly higher than NSW (coloured green).
Comparisons of survey results are generally based on the top category – the proportion of patients who selected the most positive response option.
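The top-category calculation can be sketched as follows (the responses below are invented for illustration):

```python
# Sketch: the 'top category' share for one survey question.
# The responses list is invented for illustration.

responses = ["completely", "to some extent", "completely",
             "not at all", "completely"]

# Proportion of patients who selected the most positive option
top_category_share = responses.count("completely") / len(responses)
print(f"{top_category_share:.0%}")  # 60%
```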
Responsiveness to ED patients’ needs and expectations, percentage selecting most positive option: hospital results relative to NSW
Zig-zag graph
[Graph: two examples. Hospitalisations – of the 7.5 million NSW population, 6.7 million (90%) were not hospitalised; 550,471 people (7% of people) had one hospitalisation, accounting for 45% of the 1.2 million admissions; 138,838 (2%) had two, accounting for 22%; and 96,784 (1%) had three or more, accounting for 33%. ED visits – of the 7.6 million NSW population, 6.1 million (80%) did not visit an ED; 1,012,756 people (13% of people) made one visit, accounting for 41% of the 2.5 million ED visits; 293,420 (4%) made two, accounting for 24%; and 200,015 (3%) made three or more, accounting for 35%.]
BHI uses this style of graph to show how many and how often patients visit an emergency department (ED), hospital or other healthcare facility.
This example shows the frequency of ED visits among NSW people (how often patients visited in a year).
The first bar shows the population of NSW for the year 2014–15, broken down into four categories: those who did not visit an ED during that year, and those who visited once, twice, or three or more times.
The second bar shows the proportion of all ED visits made by patients in these four categories.
Source: Annual Performance Report: Healthcare in Focus 2015
What is this graph telling me?
A small percentage of the population accounted for a high level of ED use.
During the year, 200,015 people visited an emergency department three or more times and accounted for 854,326 (35%) of all ED visits in NSW.
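That share can be checked from the quoted figures (the 2.5 million total is rounded, so the computed result is approximate):

```python
# Sketch: share of all ED visits made by frequent attenders,
# using the (rounded) figures quoted in the text.

total_visits = 2_500_000    # all NSW ED visits, 2014-15 (rounded)
frequent_visits = 854_326   # visits by people who attended three or more times

frequent_share = frequent_visits / total_visits
print(f"{frequent_share:.0%}")  # about 34% against the rounded total
```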
There is an increasing number of patients with multiple diseases (or comorbidities) who require frequent or intensive use of healthcare. Comorbidities are important to measure and understand – both for risk adjustment of outcomes such as mortality and to guide assessment of coordination and continuity of care.
Frequency of ED visits, NSW public hospitals, 2014–15
Circuit graph
[Graph: y-axis shows healthcare expenditure as a percentage of GDP/GSP, 0–18. Results for 2003 and 2013 are shown for the United Kingdom, Norway (2011), Australia, NSW, New Zealand (2012), Canada, France, Germany, Sweden, the Netherlands, Switzerland and the United States; 2013 values range from 8.5% to 17.1%. The bar beneath shows the percentage point difference between the two years for each: 1.4, -0.5, 1.1, 1.4, 2.2, 1.2, 0.8, 0.4, 2.4, 2.5, 0.7, 1.9.]
Circuit graphs are used to compare the gap between two measures or time points.
This example shows the percentage of Gross Domestic Product (GDP) or Gross State Product (GSP) spent on healthcare (public and private) in NSW and different comparator countries.
The ‘circuit line’ represents the gap between results for 2003 and 2013.
The bar underneath the graph shows the percentage point difference between the two time points.
Source: Annual Performance Report: Healthcare in Focus 2015
What is this graph telling me?
The proportion of GSP in NSW spent on healthcare (public and private) increased between 2003 and 2013 from 8.1% to 9.4%.
Countries such as the United Kingdom and Canada had similar increases.
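The distinction between a percentage point difference and a relative change can be sketched using NSW’s quoted figures:

```python
# Sketch: percentage point difference vs. relative change, using the NSW
# figures quoted in the text (8.1% of GSP in 2003, 9.4% in 2013).

share_2003 = 8.1
share_2013 = 9.4

pp_difference = share_2013 - share_2003                     # 1.3 percentage points
relative_increase = (share_2013 - share_2003) / share_2003  # about a 16% rise

print(f"{pp_difference:.1f} percentage points")
```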
International comparisons put NSW performance in context – showing where NSW stands and where to look to guide improvement.
Total healthcare expenditure as a percentage of Gross Domestic (or State) Product, NSW and comparator countries, 2003 and 2013 (or nearest year)