Transcript of PROJECT NAME: Making both the EHR and Providers Smarter ...
Please complete all of the following sections. Submission is
limited to a maximum word count of 1500 (not including text in
graphs). Overview: Describe 1) where the work was completed (in
what type of department/unit); 2) the reason the change was needed;
3) what faculty/staff/patient groups were involved, and 4) the
alignment to organizational goals. Underuse of evidence-based care
for chronic disease management and prevention remains a major
quality problem. Standard-build electronic health records (EHRs) often lack intelligent decision support to measure and improve quality. UT Southwestern Medical Center in Dallas (UTSW) has used the outpatient Epic EHR for 8 years, but was unable to harness EHR data for QI. The generic build of Epic is infamous for being an IT 'roach motel': data goes in, but it doesn't come out. We undertook
this UT Health System-funded IT QI grant at UTSW as a
multidisciplinary collaboration between all 4 adult primary care
practices in General Internal Medicine, Geriatrics and Family
Medicine, and our IT, Quality and Patient Safety, Health Services
Research, and Medical Informatics units. Medical informatics
experts at Northwestern Univ., with 6 years of experience optimizing outpatient Epic, were also part of the project team. These 4
practices and 66 physicians care for 28,000 individuals and
manually-obtained quality reports suggested we lagged many national
quality benchmarks. The project aligned with institutional goals to: 1) obtain NCQA Patient-Centered Medical Home accreditation; 2) benefit from pay-for-performance (P4P) quality incentive programs; 3) establish an outpatient quality measurement program; and 4) gain local IT expertise in customizing Epic to improve quality and efficiency, implement intelligent decision support, and meet meaningful use criteria. Aim Statement (max points 150): Describe
the problem that you sought to address. Our aim was to customize
the EHR to: 1) Create a provider-friendly, exception reporting
system whereby clinicians can efficiently document medical or
patient reasons why quality measures may not apply to a given
patient, improving the accuracy of quality indicators; 2) implement new, customized Best Practice Alerts (BPAs) in Epic to serve as a user-friendly, one-stop, low-hassle functionality that identifies any quality deficits in real time so a provider can act on
them during the patient visit as part of routine work flow; and 3) assess the impact of a multifactorial audit-and-feedback and EHR-based decision support intervention on improving our performance on national quality indicators. We focused on 19 high-priority quality measures across chronic diseases (diabetes, heart disease) and preventive services (cancer/osteoporosis screening, immunization) showing suboptimal performance. We implemented this across all 4 adult primary care practices staffed by 66 physicians.

PROJECT NAME: Making both the EHR and Providers Smarter: Optimizing the Epic EHR to create accurate ambulatory quality measures and user-friendly Best Practice Alerts to drive quality improvement.
Institution: UT Southwestern
Primary Author: Ethan Halm
Secondary Authors: Jason Fish, Deepa Bhat
Other Authors: Temple Howell-Stampley, Lynne Kirk, Brett Moran, Manjula Cherukuri, Kim Batchelor, Heather Schneider
Project Category: Effectiveness
Choose most appropriate category: 1) Patient Safety, 2) Patient Centered Care, 3) Timeliness,

Measures of Success: How did you measure the
impact of your proposed change? Our primary outcome focused on the 19 high-priority chronic disease and prevention measures: specifically, whether the quality measure was satisfied by completion of the recommended service or therapy, or an appropriate
exception was documented using the BPA. We tested for intervention effects by fitting logistic regression models of measure completion with a fixed, 2-level time factor (pre- and post-intervention) and random effects to account for repeated measures within patients. We interpreted statistically significant improvements (P<0.05) over time as evidence of an intervention effect. All tests were performed using SAS software. As secondary outcomes, we are examining how often the different components of the Epic enhancements (exception documentation, BPAs) were triggered, used, ignored, or partly used as a way of assessing adoption, familiarity, and engagement with the EHR. Use of Quality Tools (max points 250):
What quality tools did you use to identify and monitor progress and
solve the problem? Provide sample QI tools, such as fishbone
diagram or process map. We have used several complementary QI
tools. One was the creation of standardized, normed audit and
feedback quality reports. These reports include provider-specific numeric performance, clinic averages, and NCQA HEDIS targets, as well as visual gestalt comparisons to national benchmarks (green-yellow-red visual performance; see Attachment 1). In addition, we have practice-level pre- and post-BPA implementation reports used to refine the BPAs and practice workflow (Attachment 2). During the planning phase, the multidisciplinary team met several times to brainstorm strategies for evaluating the current state, development of the BPAs, and process improvement. Examples of Fishbone diagrams, Pareto charts, and Process Maps used to assess reasons for missing labs in patients with diabetes are attached (Attachment 3). In the implementation phase, Decision Tree analysis (Attachment 4) was done to understand exception reporting trends and improve provider engagement. We also used PDSA cycles to maintain and refine the BPA programming logic. Interventions
(max points 150 includes points for innovation): What was your
overall improvement plan? How did you implement the proposed
change? Who was involved in implementing the change? How did you
communicate the change to all key
stakeholders? What was the timeline for the change? Describe any
features you feel were especially innovative. The overall
improvement plan was to accurately measure, report, and improve performance on 19 chronic disease and preventive service national quality measures. We developed, refined, programmed, and implemented BPAs for the 19 measures over a 12-month period. Prior to and upon implementation, our providers were trained on the rationale for and use of the BPAs through a series of in-person provider meetings, as well as email dissemination of a 4-minute voice-over-narrated video demonstrating proper BPA use with animated Epic screenshot movies. Several innovative IT elements are worth noting. First, we
developed and refined computerized algorithms for case identification (getting the 'denominator' of the quality measure right) that pulled ICD-9 data from the encounter, problem list, medical/surgical history, and health care maintenance modules. This approach was more sensitive at identifying eligible patients than relying on one element (e.g., the problem list). Second, to get the
'numerator' right, we did two things. We pulled lab, procedure, and imaging results straight from the EHR. We also used data from the new 'exception reporting' buttons we created for all BPAs, since clinicians often know things about patients not readily ascertainable from structured electronic data. We created novel exception buttons for the following scenarios: 1) patient did not have the disease (false positive, e.g., a past diagnosis of gestational diabetes now resolved); 2) medical contraindication to recommended care (e.g., symptomatic bradycardia on beta-blockers in an MI patient); 3) patient reason something wasn't done (e.g., patient refusing vaccination); or 4) test/procedure completed outside of UTSW (e.g., colonoscopy done by a community gastroenterologist). The system saved these data so physicians did not keep getting 'penalized' for patients for whom the measure should not apply or was satisfied by data the doctor, but not the EHR, knew about. Third, BPAs were structured as
'passive' alerts, not 'hard stops' that interrupt clinical work flow. This was designed to prevent 'pop-up fatigue.' A single BPA tab on the left of the screen was highlighted yellow if something was due. The provider could click on that tab to see what was due if/when they wanted. When they clicked on it, the alert made it easy to do the right thing (displaying last values, relevant data, and a SmartSet to satisfy the measure with one mouse click). Results (max
points 250): Include all results, using control charts, graphs or
tables as appropriate. From an outcomes perspective, on the 19 quality indicators we saw statistically significant improvements in 13 (p<.05) and a trend towards improvement in another (p=.06; see the 3 results tables in Attachment 2 for details). This included significant improvements in all 5 preventive services, with absolute improvements from 4% to 14.6%. Improvements were also seen for lab monitoring (A1C, LDL, nephropathy) and medication use (antiplatelets, beta-blockers, ACEI/ARBs). No improvements were seen in A1C, LDL, or BP control, which were not explicitly targeted with improvement strategies by the BPAs. The number of measures exceeding national benchmarks increased from 8/19 pre-intervention to 12/19 post-intervention. From a process perspective, post-intervention over 40,000 BPAs were triggered and over 5,000 exceptions were documented. Exception reporting improved performance in several areas. For example, colorectal cancer screening was 70.7% pre-intervention, improved to 73.9% after BPA activation, and further increased to 84% factoring in exception data (p<.05).
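The project tested these pre/post differences with SAS mixed-effects logistic models. As a simpler back-of-the-envelope check, rates like the colorectal screening figures above can be compared with a two-proportion z-test, which ignores the within-patient correlation the full models account for. A minimal Python sketch, with counts reconstructed from the reported rates:

```python
from math import sqrt, erfc

def two_proportion_z(met1, n1, met2, n2):
    """Two-sided two-proportion z-test on completion counts."""
    p1, p2 = met1 / n1, met2 / n2
    pooled = (met1 + met2) / (n1 + n2)        # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = erfc(abs(z) / sqrt(2))          # two-sided normal tail probability
    return z, p_value

# Colorectal cancer screening: 70.7% of 5748 pre-BPA vs. 84.0% of 6330
# post-BPA (exceptions considered); counts approximated from the rates.
z, p = two_proportion_z(round(0.707 * 5748), 5748, round(0.840 * 6330), 6330)
print(f"z = {z:.1f}, p = {p:.2g}")
```

With differences this large and denominators in the thousands, the crude test and the mixed models agree on significance; the random effects matter mainly for borderline measures.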
Revenue Enhancement /Cost Avoidance / Generalizability (max points
200): What is the revenue enhancement /cost avoidance and/or
savings for your project? Did you implement this project in
multiple sites after determining that your change was successful?
This project has had several implications for revenue enhancement. First, it enabled us to obtain Level 3 NCQA Patient-Centered Medical Home accreditation, and we are now negotiating shared-savings contracts with two commercial insurers for the upcoming year based on this. Second, it led to recognition in the national Bridges to Excellence program in diabetes, which yielded a $15,000 bonus to the practice, and recognition in Blue Cross/Blue Shield of Texas' Diabetes P4P program that will yield an additional quality bonus of $100 per patient meeting benchmark per year (estimated to yield $25,000-50,000/year). We are in the process of applying for the sister national Bridges to Excellence P4P program in heart disease, which ought to yield similar financial benefits. The institution is also planning on disseminating the diabetes and heart disease BPAs to the Endocrinology and Cardiology clinics, and applying for Bridges to Excellence P4P programs in these clinics. These accomplishments should also yield quality bonuses from the CMS PQRI program. The institution also plans to roll out these IT enhancements to other primary care practices that UTSW is looking to bring into an affiliated community practice network that would use our Epic EHR. Finally, this suite of ambulatory Epic enhancements and BPAs is being implemented in the 11 community-oriented primary care clinics run by Parkland Hospital and Health System, our affiliated safety net provider. Conclusions and Next
Steps: Describe your conclusions drawn from this project and any
recommendations for future work. How does project align with
organizational goals? Describe, as applicable, how you plan to move
ahead with this project. Many of the next steps are outlined above
with regard to dissemination to other UTSW and affiliated clinics.
We also plan on continuing to refine the visit-based electronic case identification algorithms and BPAs. Our next phase will also add a population management component to the program, using the practice-based quality reports to proactively reach out to patients not meeting quality indicators via our practice care managers and nurses. Additionally, the multidisciplinary project team will evaluate, prioritize, and design a second set of IT-enabled quality measures and BPA decision support tools. Institutionalization of two aspects of the project will further ensure sustainability: the Epic programming for future BPAs has now shifted from grant funding to a UTSW IT employee on the meaningful use team, and the data analysis has shifted to a programmer in our Office of Quality and Safety. Total abstract word count:
1493
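The exception-button logic described under Interventions can be sketched as follows. This is a minimal illustration with hypothetical field names, not the production Epic rule logic; it assumes scenarios 1-3 remove a patient from the measure denominator while scenario 4 satisfies the numerator, which is our reading of the four scenarios above (NCQA/HEDIS specifications define the authoritative treatment).

```python
from dataclasses import dataclass, field

# Hypothetical record shape; the exception categories mirror the four
# exception-button scenarios described in the abstract.
@dataclass
class Patient:
    eligible: bool                # met denominator criteria (case identification)
    service_completed: bool       # numerator satisfied from structured EHR data
    exceptions: set = field(default_factory=set)

DENOMINATOR_EXCEPTIONS = {"no_disease", "medical_contraindication", "patient_refusal"}
NUMERATOR_EXCEPTIONS = {"completed_outside"}

def measure_rate(patients):
    """% of eligible patients meeting the measure, honoring exception buttons."""
    denominator = numerator = 0
    for p in patients:
        # Scenarios 1-3 (false-positive diagnosis, contraindication,
        # patient refusal) remove the patient from the denominator.
        if not p.eligible or p.exceptions & DENOMINATOR_EXCEPTIONS:
            continue
        denominator += 1
        # Scenario 4 (service done outside UTSW) counts as satisfied.
        if p.service_completed or p.exceptions & NUMERATOR_EXCEPTIONS:
            numerator += 1
    return 100.0 * numerator / denominator if denominator else None

example = [
    Patient(True, True),                            # completed in the EHR
    Patient(True, False, {"completed_outside"}),    # e.g., outside colonoscopy
    Patient(True, False, {"patient_refusal"}),      # removed from denominator
]
print(measure_rate(example))  # 2 of 2 in-denominator patients met -> 100.0
```

This is why exception reporting raised measured performance (e.g., colorectal screening 73.9% to 84%): patients the measure should not count against, or who were already served elsewhere, stop depressing the rate.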
Office of Quality Improvement & Safety - DRAFT
Patient Centered Medical Home Clinical Outcomes - Diabetes Management
Prepared by: Office of Quality Improvement & Safety, as of 21 MAY 2012 (EPIC, Page 1 of 7)

Data Source: National means reported by National Committee for Quality Assurance (NCQA) Healthcare Effectiveness Data and Information Set (HEDIS) for the year 2009/2010. BP measurement national average listed is preferred practice standard.

Key Definitions:
Diabetes: Diagnosis confirmed by the presence of ICD9 250.XX, 357.2, 362.0X, or 366.41 in the encounter diagnoses (past 5 years), problem list, or medical history records.
Assigned Physician: PCP with whom the patient had 2 or more visits during the two-year period. If a patient changed PCPs during the two-year period and has 2 or more visits with more than one PCP, the patient was assigned to the PCP with the most visits. If the patient was seen an equal number of times by one or more PCPs, the patient was assigned to the PCP as of the last visit.
Established Patient: Patient with 2 or more office visits with the same PCP in two years (Jul-2009 to Jun-2011), with at least one visit in the last 12 months (Jul-2010 to Jun-2011).

Inclusion Criteria: All established patients, ages 18-75 years, with a diagnosis of diabetes.
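The "Assigned Physician" rule in these definitions can be sketched as a small function. This is an illustrative reimplementation with hypothetical PCP identifiers, not the report's actual logic:

```python
from collections import Counter

def assign_pcp(visits):
    """Attribute a patient to a PCP from two years of office visits.

    `visits` is a chronological list of PCP IDs, one per office visit.
    Rule from the report definitions: consider PCPs with 2+ visits;
    assign to the PCP with the most visits; if visit counts are equal,
    assign to the PCP as of the last visit.
    """
    counts = Counter(visits)
    qualifying = {pcp: n for pcp, n in counts.items() if n >= 2}
    if not qualifying:
        return None  # no PCP with 2+ visits: patient is not attributed
    best = max(qualifying.values())
    tied = {pcp for pcp, n in qualifying.items() if n == best}
    # Equal number of visits: assign to the PCP as of the last visit.
    for pcp in reversed(visits):
        if pcp in tied:
            return pcp

print(assign_pcp(["dr_a", "dr_b", "dr_a", "dr_b"]))  # tie -> last visit: dr_b
```

Getting this attribution right matters because the provider-level report below counts each patient in exactly one physician's denominator.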
Diabetes Management (EPIC). Provider ID: 1234. Provider Name: Provider Last Name, First Name. GIM Practice.
Reporting Period: Jul-2010 to Jun-2011

Measure | GIM Practice: Number Eligible | GIM Practice: % Met | Provider 1234: Number Eligible | Provider 1234: % Met | NCQA National Average
HgbA1c measured in the last 12 months | 1607 | 88.4 | 126 | 92.1 | 89.3
HgbA1c < 8% (denominator: all diabetics) | 1607 | 70.2 | 126 | 77.0 | 62.4
HgbA1c < 8% (denominator: all diabetics with non-missing values) | 1421 | 79.4 | 116 | 83.6 | .
LDL-C measured in the last 12 months | 1607 | 81.7 | 126 | 88.1 | 85.9
LDL-C < 100mg/dl (denominator: all diabetics) | 1607 | 55.3 | 126 | 51.6 | 48.2
LDL-C < 100mg/dl (denominator: all diabetics with non-missing values) | 1313 | 67.6 | 111 | 58.6 | .
BP measured in the last 12 months | 1607 | 97.1 | 126 | 99.2 | 95.0
BP < 130/80mmHg (denominator: all diabetics) | 1607 | 37.7 | 126 | 38.1 | 33.6
BP < 130/80mmHg (denominator: all diabetics with non-missing values) | 1560 | 38.8 | 125 | 38.4 | .
Medical Attention for Nephropathy in the last 12 months | 1607 | 83.1 | 126 | 90.5 | 85.3

Legend: 1 = >1% Above National Average; 2 = <1% Below National Average; 3 = +/-1% of National Average.
Diabetes Management: Pre-BPA Implementation vs. Post-BPA Implementation
Pre-BPA Implementation Period: Jul-2010 to Jun-2011; Post-BPA Implementation Period: Jul-2011 to Jun-2012.

Measure | Pre-BPA: Number Eligible | Pre-BPA: % Criteria Met | Post-BPA (No Exceptions): Number Eligible | Post-BPA (No Exceptions): % Criteria Met | Difference in % Completion Rates: Post-BPA (Exceptions Considered) - Pre-BPA | P-value | NCQA National Average (Pre-BPA Period)
HgbA1c measured in the last 12 months | 1607 | 88.4 | 1756 | 90.3 | 3.0 | 0.0180 * | 89.3
HgbA1c < 8% (denominator: all diabetics) | 1607 | 70.2 | 1756 | 71.9 | 1.9 | 0.4147 | 62.4
HgbA1c < 8% (denominator: all diabetics with non-missing values) | 1421 | 79.4 | 1586 | 79.6 | 0.1 | 0.7875 | .
LDL-C measured in the last 12 months | 1607 | 81.7 | 1756 | 86.9 | 6.4 | <0.0001 * | 85.9
LDL-C < 100mg/dl (denominator: all diabetics) | 1607 | 55.3 | 1756 | 58.9 | 3.8 | 0.0194 * | 48.2
LDL-C < 100mg/dl (denominator: all diabetics with non-missing values) | 1313 | 67.6 | 1526 | 67.8 | 0.1 | 0.5224 | .
BP measured in the last 12 months | 1607 | 97.1 | 1756 | 97.8 | 0.7 | 0.1083 | 95.0
BP < 130/80mmHg (denominator: all diabetics) | 1607 | 37.7 | 1756 | 36.7 | -1.2 | 0.4101 | 33.6
BP < 130/80mmHg (denominator: all diabetics with non-missing values) | 1560 | 38.8 | 1717 | 37.6 | -1.5 | 0.3285 | .
Medical Attention for Nephropathy in the last 12 months | 1607 | 83.1 | 1756 | 86.4 | 4.2 | <0.0001 * | 85.3

* P < 0.05 for differences between post-BPA (exceptions considered) and pre-BPA outcomes.
Please note: BP measurement and control were not included in the project-related interventions.

CHD: Pre-BPA Implementation vs. Post-BPA Implementation
Measure | Pre-BPA: Number Eligible | Pre-BPA: % Criteria Met | Post-BPA (No Exceptions): Number Eligible | Post-BPA (No Exceptions): % Criteria Met | Post-BPA (Exceptions Considered): Number Eligible | Post-BPA (Exceptions Considered): % Criteria Met | Difference in % Completion Rates: Post-BPA (Exceptions Considered) - Pre-BPA | P-value | NCQA National Average (Pre-BPA Period)
CHD: Antiplatelet Therapy | 1385 | 77.9 | 1543 | 79.5 | 1477 | 81.2 | 3.3 | 0.0050 * | .
CHD: LDL-C measured in the last 12 months | 1385 | | 1543 | 79.5 | 1472 | 81.5 | 5.5 | 0.0005 * | 88.4
CHD: LDL-C < 100mg/dl (denominator: all CHD patients) | 1385 | | 1543 | 57.5 | 1472 | 59.3 | 2.8 | 0.1166 | 57.6
CHD: LDL-C < 100mg/dl (denominator: all CHD patients with non-missing values) | 1052 | 74.3 | 1226 | 72.3 | 1186 | 73.6 | -0.7 | 0.7370 | .
CHD: BP measured in the last 12 months | 1385 | 95.1 | 1543 | 95.9 | 1489 | 95.8 | 0.7 | 0.4094 | 95.0
CHD: BP < 140/90 mmHg (denominator: all CHD patients) | 1385 | 69.8 | 1543 | 69.4 | 1489 | 69.6 | -0.2 | 0.9244 | .
CHD: BP < 140/90 mmHg (denominator: all CHD patients with non-missing values) | 1317 | 73.4 | 1480 | 72.4 | 1427 | 72.6 | -0.8 | 0.6426 | .
CHD with MI: Beta Blocker Therapy | 292 | 67.8 | 343 | 73.5 | 328 | 76.5 | 8.7 | 0.0012 * | 77.9
CHD with DM: ACEi/ARB Therapy | 489 | 77.5 | 572 | 79.0 | 542 | 81.4 | 3.9 | 0.0633 | .

Pre-BPA Implementation Period: Jul-2010 to Jun-2011; Post-BPA Implementation Period: Jul-2011 to Jun-2012.
* P < 0.05 for differences between post-BPA (exceptions considered) and pre-BPA outcomes.
Please note: BP measurement and control were not included in the project-related interventions.

Preventive Services: Pre-BPA Implementation vs. Post-BPA Implementation
Measure | Pre-BPA: Number Eligible | Pre-BPA: % Criteria Met | Post-BPA (No Exceptions): Number Eligible | Post-BPA (No Exceptions): % Criteria Met | Post-BPA (Exceptions Considered): Number Eligible | Post-BPA (Exceptions Considered): % Criteria Met | Difference in % Completion Rates: Post-BPA (Exceptions and HM Overrides Considered) - Pre-BPA | P-value | NCQA National Average (Pre-BPA Period)
Screening Mammography, women ages 40 to 69 years | 3409 | 71.5 | 3646 | 69.8 | 3579 | 75.5 | 4.0 | <0.0001 * | 70.4
Pneumococcal Vaccination, ages 65+ years | 4163 | 66.7 | 4868 | 72.8 | 4787 | 81.3 | 14.6 | <0.0001 * | 82.0
Colorectal Cancer Screening, ages 50 to 75 years | 5748 | 70.7 | 6377 | 73.9 | 6330 | 84.0 | 13.3 | <0.0001 * | 58.3
Osteoporosis Screening, women ages 65+ years | 2439 | 72.4 | 2855 | 76.4 | 2804 | 79.9 | 7.5 | <0.0001 * | 68.0
Cervical Cancer Screening, women ages 21 to 64 years | 3204 | 53.6 | 3366 | 53.7 | 3285 | 62.3 | 8.7 | <0.0001 * | 74.2

Pre-BPA Implementation Period: Jul-2010 to Jun-2011; Post-BPA Implementation Period: Jul-2011 to Jun-2012.
* P < 0.05 for differences between post-BPA (exceptions and overrides considered) and pre-BPA outcomes.
Attachment 2 contd.
Fishbone Analysis of the possible reasons for missing HgbA1c and LDL-C tests among diabetics.
Pareto Chart analyzing the reasons for missing HgbA1c and LDL-C tests among diabetics, using data collected by manual chart review.
Attachment 2: Post-BPA Analysis of Diabetes, CHD, and Preventive Screening Outcomes (PostBPA_DiabetesOutcomes).

Attachment 4: Decision Tree (figure) analyzing why a BPA appeared ("mis-fire"), and whether the patient was removed from the BPA QI data (examples: permanent exemption, BPA off, managed by other MD), remains in the QI data, or the point-of-service visit was not included in the MD's QI data period.