Transcript of Automating Revenue Integrity During the ICD-10 Transition and Beyond
• Anne Robertucci, UPMC
• Eileen Simmons, UPMC Presbyterian Shadyside
Automating Revenue Integrity During the ICD-10 Transition and Beyond
UPMC Snapshot
• $11 billion integrated global health enterprise
• Western PA's largest employer; 60,000 employees
• 22 academic, community, and regional hospitals; 4,732 licensed beds
• 187,000+ inpatient admissions
• 174,000 surgeries performed annually
• 4.5 million+ outpatient visits
• 600,000 emergency visits
• 40+ UPMC Cancer Centers; 180 affiliated oncologists
• UPMC Health Plan: 2 million total members; network of 125+ hospitals and 11,500+ physicians
Clinical Documentation Improvement at UPMC: The Challenge
• No clinical documentation improvement (CDI) programs in place
• 100% retrospective focus
• Average 550 inpatient medical records coded per day
• 5% of total discharges result in a query
• Physician query process is labor intensive
• Queries that are not resolved quickly impact the DNFB
• 5 FTEs across 3 sites
[Slide graphic: moving from 100% retrospective to concurrent review; 550 records/day; 5% query rate ≈ $1M/month]
Balancing Organizational Approach with Physician Needs
Hospital and physician priorities (two-column slide):
• Provide education
• Clinically relevant
• Embedded query workflow
• Concurrent querying
• FTE-neutral solution
• Improved outcomes
• Compliant practices
• Eliminate missed opportunities
• Provides a natural language processing (NLP) engine for case finding, locating the cases with the highest potential for documentation issues.
• Eliminates the use of spreadsheets, admission reviews, and other manual methods for locating cases.
• Decreases the number of chart reviews required to produce a valid query, and the time it takes to review a medical record to validate a CDI opportunity.
• Produces more compliant queries by generating system references, based on the CDI case-finding rules, from information already in the medical record.
• With improved precision and recall, allows a percentage of all queries to be auto-generated directly to the physician using the auto-query functionality.
• Increases quality scores and metrics for better patient care.
Beyond the EHR - CDI Software and Services
The Tools and the Team
• 9 facilities / 15 FTEs
• Automation (via the NLP engine) reduces staffing requirements for case finding
• Electronic work lists linked directly to electronic health record information
• Seamless work distribution and remote work options for CDI specialists
• Remote review options for physician advisors
• Tool enables the CDI specialist to tag key record components for the provider to review seamlessly during query completion
• Electronic query capability within the physician workflow reduces response time and increases physician satisfaction
• Integration of coding within the CDI workflow
Striving Toward a Field of CDI Dreams – The UPMC Story
Co-developed the first inpatient computer-assisted coding (CAC) solution, launched in 2008.
• CMI up 8%
• Coder productivity up 20%
• External audits down 50%
• Coder overtime reduced
CDI Technology — How it Works
Concurrent CDI Case Finding
Continuous processing of EMR data through NLP engine to code and apply case-finding rules to each admission
If a case is marked for CDI, ensure that it conforms to business rules for presentation to a user:
• Financial class
• Revenue code
• Physician service
• Location
How should it be routed?
• Direct to physician
• Peer advisor
• CDI specialist/manager
• Specific user/coder
Business Rules Logic
Passive Query Building
• Query is passively built, with minimal (if any) additional editing or updates required by the CDIS
• Presented to the physician either via an EMR Inbox interface or the PQRT Portal
• Query response is returned to the NLP engine
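As a minimal sketch of the business-rules step described above: the deck names financial class, revenue code, physician service, and location as filters, and four possible routes, but does not specify the actual rule values, so every field name and threshold below is a hypothetical placeholder.

```python
# Illustrative sketch of the case-routing business rules on this slide.
# Field names, rule values, and routing conditions are hypothetical; the
# deck does not describe the actual Optum rule set.

def route_case(case):
    """Return a routing target for a CDI-flagged case, or None to suppress it."""
    # Filter: present only cases that conform to the business rules.
    if case.get("financial_class") not in {"Medicare", "Commercial"}:
        return None
    if case.get("revenue_code") in {"0960"}:  # hypothetical excluded revenue code
        return None

    # Routing: decide who should see the query first.
    if case.get("auto_query_eligible"):
        return "physician"        # high-precision rule: direct to physician
    if case.get("physician_service") == "Cardiology":
        return "peer_advisor"
    return "cdi_specialist"

example = {
    "financial_class": "Medicare",
    "revenue_code": "0110",
    "physician_service": "Medicine",
    "auto_query_eligible": False,
}
print(route_case(example))  # cdi_specialist
```

In a production rules engine these conditions would be data-driven rather than hard-coded, so each facility can adjust filters without code changes.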
Two Types of CDI Opportunities That Test Optum NLP Capability

Example 1: Specificity
Physician documents "CHF improving."
NLP identifies:
• "CHF" in history and physical
• "CHF" in progress note
• Suggests code for unspecified CHF
Approach to query:
• Engage physician to provide specificity in the CHF diagnosis
– Acute vs. chronic
– Diastolic vs. systolic

Example 2: Clinical Clarity
Physician documents "fluid retention and shortness of breath improving."
NLP identifies:
• Pulmonary vascular congestion in CXR
• Ejection fraction of <30% in echo
• BNP of 700
• IV Lasix in MAR
Approach to query:
• Engage physician to clarify the clinical facts
• Ascertain whether a diagnosis could be added to reflect the clinical picture and the rationale for this patient's treatment
• Follow with a query for specificity in the diagnosis, if indicated
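A toy illustration of how the discrete indicators in Example 2 might be combined into a case-finding rule. The indicator names, thresholds, and "3 of 4" convergence requirement are invented for illustration; they are not Optum's actual rules.

```python
# Toy case-finding rule for the "clinical clarity" heart-failure example
# above. Indicator names and thresholds are illustrative only.

def chf_clarity_opportunity(findings):
    """Flag a possible undocumented heart-failure diagnosis."""
    indicators = [
        findings.get("cxr_pulmonary_congestion", False),  # radiology finding
        findings.get("ejection_fraction", 100) < 30,      # echo result
        findings.get("bnp", 0) > 500,                     # lab value
        "IV Lasix" in findings.get("medications", []),    # MAR entry
    ]
    # Require several converging indicators before suggesting a query,
    # trading recall for precision (fewer false-positive queries).
    return sum(indicators) >= 3

case = {
    "cxr_pulmonary_congestion": True,
    "ejection_fraction": 28,
    "bnp": 700,
    "medications": ["IV Lasix", "metoprolol"],
}
print(chf_clarity_opportunity(case))  # True
```

The point of the convergence threshold is the one made on the slide: no single finding justifies a query, but several discrete data points together describe an undocumented clinical picture.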
The Challenge (or business goal)
[Slide graphic: system-built queries vs. manually built; difficulty ranges from easy to moderate for discrete data, up to high]
Transitioning to the New Technology Solution
• Inpatient documentation review (i.e., queries); NLP automation opportunities identified
• CDI classroom education for re-tooled DRG specialists
• Clinical-indication training specific to coders, to understand markers
• Interface testing; the key to success is data (28+ interfaces)
• Application use-case testing with super users
• Technical training of providers and re-tooled DRG specialists
• Implementation rollout by facility
• Consistent testing and adherence to the project plan, repeated for each rollout
• Monitoring of key metrics
UPMC Optum CDI Outcomes
Optum CDI Queries Yielding ROI
Most frequent service lines queried:
• Cardiology
• Medicine
• Oncology
Physician Query Response Rates: Paper Query vs. Optum CDI
Physician Query Average Turnaround Time: Paper Query Process vs. Optum CDI
Coding TAT to Final Bill: Cases with a Query, 2012 vs. 2013 (average, days)
UPMC CC/MCC Capture Improvement
• Electronic CDI Improvements
– 3% Improvement CC/MCC at Presbyterian/Shadyside Hospitals
– 4% Improvement CC/MCC at St. Margaret Hospital
MCC capture: Presbyterian/Shadyside 25% (manual CDI) → 28% (Optum CDI); St. Margaret 17% → 21%
CC capture: Presbyterian/Shadyside 31% → 31%; St. Margaret 32% → 32%
Physician Query-Related DRG Shift
Cases by relative-weight bucket: A <1, B 1–2, C 2–3, D 3–4, E 4–5, F 5–6, G >6
Original DRG: 105, 96, 41, 9, 3, 7, 2
Revised DRG: 26, 133, 42, 32, 9, 14, 7
DRG shift related to queries that changed the MS-DRG
Pilot: Value of Automated CDI to UPMC
Impact of more accurate documentation: $1,321,880 ÷ 5,359 cases = $247/case
UPMC performed an audit of the Clinical Documentation Improvement (CDI) physician queries for all patient records discharged from UPMC Presbyterian, Shadyside, and St. Margaret Hospitals from November 10, 2013, through December 8, 2013. A total of 5,359 discharges and corresponding queries were evaluated, yielding $1,321,880 in expected additional revenue, or an extrapolated $247 per case.

Projected annual income, extrapolated across UPMC hospital groups (Presbyterian; McKeesport/Horizon/Northwest; Mercy; Hamot/Bedford; St. Margaret; Passavant/East; Magee; Shadyside): between $2.1M and $7.7M per group ($2.1M, $2.6M, $3.4M, $3.6M, $5.8M, $7.7M, $3.9M, $3.7M), totaling $32.8M for all UPMC hospitals.
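The per-case arithmetic from the pilot can be checked directly. The figures are taken from the slide; the $32.8M systemwide projection also depends on per-hospital case volumes the deck does not break out, so only the per-case value is computed here.

```python
# Per-case value from the UPMC pilot audit figures on this slide.
pilot_revenue = 1_321_880   # expected additional revenue, Nov 10 - Dec 8, 2013
pilot_cases = 5_359         # discharges with evaluated queries

value_per_case = pilot_revenue / pilot_cases
print(f"${value_per_case:,.0f} per case")  # $247 per case
```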
Roadmap to ICD-10
• Tool enables use of technology to create the query templates required by ICD-10 implementation
• A 'training' sandbox for CDI specialists and coders in one integrated solution
• UPMC utilizes ICD-10-trained coders as a feedback loop for physician documentation issues identified during the dual-coding period
• Dual coding enables the NLP engine to further improve its ICD-10 precision and recall rates
Automated CDI and Quality
• Virtual work lists enable UPMC CDI specialists to query concurrently for SOI/ROM measures as well as MS-DRG opportunities
• SOI/ROM scores are integrated within the workflow; both coder and CDI specialist roles have direct access to weighting and scores
• SOI/ROM scores directly impact hospital Healthgrades reports as well as physician OPPE benchmarks
Conclusions
• Opportunities are bountiful
• Technology opens door to efficiency
• Metrics are key to monitoring progress
• Journey just beginning for us…
Thank you!