
Transcript of Amit V. Trivedi

Page 1

Test Results Summary for 2014 Edition EHR Certification, Version 1.0, December 13, 2015

ONC HIT Certification Program

Test Results Summary for 2014 Edition EHR Certification

Note: Product update from CHPL ID 140212R01. HealthTronics Information Technology Solutions was acquired by IntrinsiQ Specialty Solutions, an AmerisourceBergen Company.

Part 1: Product and Developer Information

1.1 Certified Product Information

Product Name: meridianSPECIALTY System

Product Version: 7.1.10

Domain: Ambulatory

Test Type: Modular EHR

1.2 Developer/Vendor Information

Developer/Vendor Name: IntrinsiQ Specialty Solutions

Address: 3101 Gaylord Parkway, Frisco, TX 75034

Website: www.intrinsiq.com

Email: [email protected]

Phone: 1-877-570-8721

Developer/Vendor Contact: Andrew Scott

Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information

ONC-ACB Name: ICSA Labs, an independent division of Verizon

Address: 1000 Bent Creek Blvd, Suite 200, Mechanicsburg, PA 17050

Website: https://www.icsalabs.com/technology-program/onc-ehr

Email: [email protected]

Phone: 717.790.8100

ONC-ACB Contact: Amit Trivedi

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

ONC-ACB Authorized Representative: Amit Trivedi

Function/Title: Program Manager – Healthcare

Signature and Date: Amit V. Trivedi, 12/13/2015

Page 2

2.2 Gap Certification

The following identifies the criterion or criteria certified via gap certification:

§170.314: (a)(1), (a)(6), (a)(7), (a)(17), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1)

*Gap certification allowed for Inpatient setting only

No gap certification

2.3 Inherited Certification

The following identifies the criterion or criteria certified via inherited certification:

§170.314: (a)(1), (a)(2), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(8), (a)(9), (a)(10), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (b)(1), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6) Inpt. only, (b)(7), (c)(1), (c)(2), (c)(3), (d)(1), (d)(2), (d)(3), (d)(4), (d)(5), (d)(6), (d)(7), (d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1), (f)(2), (f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only, (g)(1), (g)(2), (g)(3), (g)(4)

No inherited certification

Page 3

Part 3: NVLAP-Accredited Testing Laboratory Information

Report Number: 2014-EHRA307338-2014-0731-00
Test Date(s): 7/11/2014, 7/14/2014, 7/24/2014, 7/28/2014, 7/31/2014

3.1 NVLAP-Accredited Testing Laboratory Information

ATL Name: ICSA Labs, an independent division of Verizon

Accreditation Number: 200697-0

Address: 1000 Bent Creek Boulevard, Suite 200, Mechanicsburg, PA 17050

Website: https://www.icsalabs.com/technology-program/onc-ehr

Email: [email protected]

Phone: 717.790.8100

ATL Contact: Michelle Knighton

For more information on scope of accreditation, please reference http://ts.nist.gov/standards/scopes/2006970.htm

3.2 Test Information

3.2.1 Additional Software Relied Upon for Certification

Additional Software / Applicable Criteria / Functionality provided by Additional Software:

Surescripts
Applicable Criteria: ONC 314a10, ONC 314b3
Functionality: Drug formulary checking and ePrescribing

Meinberg NTP
Applicable Criteria: ONC 314d2, ONC 314e1
Functionality: Time synchronization (a synchronization sketch follows this section)

Qvera Interface Engine
Applicable Criteria: ONC 314b1, ONC 314b2, ONC 314b7, ONC 314e1, ONC 314e2
Functionality: CCDA document creation; Direct secure messaging (inbound and outbound)

MedLine Plus
Applicable Criteria: ONC 314a8, ONC 314a15
Functionality: Online education resources accessed using the "InfoButton" standard

BI Clinicals
Applicable Criteria: ONC 314c1, ONC 314c2, ONC 314c3
Functionality: Clinical Quality Measures

HealthTronics IT Solutions Patient Portal
Applicable Criteria: ONC 314e1, ONC 314e3, ONC 314g2
Functionality: Patient portal for View, Download, Transmit and Secure Messaging; automated measures for Clinical Summaries, Secure Messaging, and VDT

MedAllies
Applicable Criteria: ONC 314b1, ONC 314b2, ONC 314e1
Functionality: HISP for Direct (VDT, Transition of Care objectives)

Cerner Multum Rx
Applicable Criteria: ONC 314a1, ONC 314a2, ONC 314a6, ONC 314a8, ONC 314b2, ONC 314b4, ONC 314b7, ONC 314e1, ONC 314e2
Functionality: Medication content for CPOE, CCDA, and Clinical Information Reconciliation; interaction checking for CDS and Drug Formulary

Page 4

Health Language
Applicable Criteria: ONC 314a5, ONC 314b2, ONC 314b4, ONC 314b7, ONC 314e1, ONC 314e2
Functionality: Problem List

No additional software required
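The Meinberg NTP entry above supplies the time synchronization used for audit timestamps (ONC 314d2) and secure messaging (ONC 314e1). As a rough illustration of what such a synchronization check involves, the sketch below queries a public NTP server and reports the local clock offset; it assumes the third-party Python package ntplib and the public pool.ntp.org servers, neither of which is named in this report.

# Illustrative only: check the local clock against an NTP server.
# Assumes "pip install ntplib"; the server choice is hypothetical.
import ntplib

def clock_offset_seconds(server: str = "pool.ntp.org", version: int = 4) -> float:
    """Query an NTP server (RFC 5905 when version=4; RFC 1305 when version=3)
    and return the local clock's offset from NTP time, in seconds."""
    response = ntplib.NTPClient().request(server, version=version)
    return response.offset

if __name__ == "__main__":
    offset = clock_offset_seconds()
    # An offset near zero means the local clock agrees with NTP time;
    # a large offset would be corrected before stamping audit entries.
    print(f"Local clock offset: {offset:+.3f} s")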

3.2.2 Test Tools

Test Tool Version

Cypress 2.4.1

ePrescribing Validation Tool 1.0.4

HL7 CDA Cancer Registry Reporting Validation Tool

HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool

HL7 v2 Immunization Information System (IIS) Reporting Validation Tool

HL7 v2 Laboratory Results Interface (LRI) Validation Tool 1.7.0

HL7 v2 Syndromic Surveillance Reporting Validation Tool

Transport Testing Tool 179

Direct Certificate Discovery Tool 3.0

No test tools required

3.2.3 Test Data

Alteration (customization) to the test data was necessary and is described in Appendix A

No alteration (customization) to the test data was necessary

3.2.4 Standards

3.2.4.1 Multiple Standards Permitted

The following identifies the standard(s) that has been successfully tested where more than one standard is permitted:

Criterion # Standard Successfully Tested

(a)(8)(ii)(A)(2)

§170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain (a request sketch follows this table)

§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(13)

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

§170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree

Page 5: Amit V. Trivedi

Test Results Summary for 2014 Edition EHR Certification Version 1.0 December 13, 2015

Page 5 of 14

Template Version 1 12/13/2015 10:41 AM

Criterion # Standard Successfully Tested

(a)(15)(i)

§170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain

§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(16)(ii)

§170.210(g) Network Time Protocol Version 3 (RFC 1305)

§170.210(g) Network Time Protocol Version 4 (RFC 5905)

(b)(2)(i)(A)

§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(7)(i)

§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(e)(1)(i) Annex A of the FIPS Publication 140-2 • AES 128, SHA-1

(e)(1)(ii)(A)(2)

§170.210(g) Network Time Protocol Version 3 (RFC 1305)

§170.210(g) Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii) Annex A of the FIPS Publication 140-2 • AES 128, SHA-1 (an encryption/hashing sketch follows section 3.2.4.2)

Common MU Data Set (15)

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

§170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)

None of the criteria and corresponding standards listed above are applicable
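Several rows above cite §170.204(b)(1), the URL-based Infobutton standard the product uses with MedLine Plus (section 3.2.1) for patient education content. As a hedged sketch of how such a request is typically assembled: the base URL and parameter names below follow MedlinePlus Connect's publicly documented Infobutton interface rather than anything stated in this report, and the SNOMED CT example code is illustrative only.

# Illustrative only: build a URL-based Infobutton request (§170.204(b)(1)).
# Endpoint and parameter names are assumptions based on MedlinePlus
# Connect's public documentation, not taken from this report.
from urllib.parse import urlencode

MEDLINEPLUS_CONNECT = "https://connect.medlineplus.gov/service"
SNOMED_CT_OID = "2.16.840.1.113883.6.96"  # OID identifying the code system

def infobutton_url(code: str, display_name: str) -> str:
    """Assemble a context-aware knowledge request for a coded problem."""
    params = {
        "mainSearchCriteria.v.cs": SNOMED_CT_OID,    # code system of the problem
        "mainSearchCriteria.v.c": code,              # the coded concept
        "mainSearchCriteria.v.dn": display_name,     # human-readable name
        "knowledgeResponseType": "application/json", # ask for a JSON feed
    }
    return f"{MEDLINEPLUS_CONNECT}?{urlencode(params)}"

if __name__ == "__main__":
    # Hypothetical example: patient education for type 2 diabetes.
    print(infobutton_url("44054006", "Diabetes mellitus type 2"))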

3.2.4.2 Newer Versions of Standards

The following identifies the newer version of a minimum standard(s) that has been successfully tested:

Newer Version Applicable Criteria

No newer version of a minimum standard was tested
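Section 3.2.4.1 names the FIPS 140-2 Annex A algorithms tested for (e)(1) and (e)(3): AES 128 for encryption and SHA-1 for hashing. The sketch below exercises those two algorithms in isolation using the third-party Python cryptography package; it illustrates the named standards and is not the certified product's implementation.

# Illustrative only: AES-128 (CBC) encryption and a SHA-1 digest, the
# Annex A algorithms listed in section 3.2.4.1. Key handling, mode, and
# padding choices here are assumptions made for a runnable example.
import hashlib
import os

from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_aes128_cbc(plaintext: bytes, key: bytes, iv: bytes) -> bytes:
    """Encrypt with AES-128 in CBC mode; key and IV are 16 bytes each."""
    padder = padding.PKCS7(algorithms.AES.block_size).padder()
    padded = padder.update(plaintext) + padder.finalize()
    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return encryptor.update(padded) + encryptor.finalize()

if __name__ == "__main__":
    key, iv = os.urandom(16), os.urandom(16)  # 16-byte key -> AES-128
    summary = b"clinical summary prepared for download"
    ciphertext = encrypt_aes128_cbc(summary, key, iv)
    digest = hashlib.sha1(summary).hexdigest()  # SHA-1 integrity hash
    print(len(ciphertext), digest)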

Page 6

Page 7

3.2.5 Optional Functionality

Criterion # Optional Functionality Successfully Tested

(a)(4)(iii) Plot and display growth charts

(b)(1)(i)(B) Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)

(b)(1)(i)(C) Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)

(b)(2)(ii)(B) Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)

(b)(2)(ii)(C) Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)

(f)(3) Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)

Common MU Data Set (15): Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)

Common MU Data Set (15): Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)

No optional functionality tested

Page 8

3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria # (with TP** and TD*** versions where recorded):

(a)(1)
(a)(2)
(a)(3)
(a)(4): TP 1.4, TD 1.3
(a)(5): TP 1.4, TD 1.3
(a)(6)
(a)(7)
(a)(8): TP 1.2
(a)(9): TP 1.3, TD 1.3
(a)(10)
(a)(11): TP 1.3
(a)(12)
(a)(13): TP 1.2
(a)(14): TP 1.2
(a)(15): TP 1.5
(a)(16) Inpt. only
(a)(17) Inpt. only
(b)(1): TP 1.7, TD 1.4
(b)(2): TP 1.4, TD 1.6
(b)(3): TP 1.4, TD 1.0.4
(b)(4): TP 1.3, TD 1.4
(b)(5): TP 1.4, TD 1.7.0
(b)(6) Inpt. only
(b)(7): TP 1.4, TD 1.7
(c)(1): TP 1.7.1, TD 2.4.1
(c)(2): TP 1.7.1, TD 2.4.1
(c)(3): TP 1.7.1, TD 2.4.1
(d)(1)
(d)(2): TP 1.5
(d)(3): TP 1.3
(d)(4): TP 1.3
(d)(5)
(d)(6)
(d)(7): TP 1.2
(d)(8)
(d)(9) Optional
(e)(1): TP 1.8, TD 1.5
(e)(2) Amb. only: TP 1.2, TD 1.6
(e)(3) Amb. only: TP 1.3
(f)(1)
(f)(2)
(f)(3)
(f)(4) Inpt. only
(f)(5) Optional & Amb. only
(f)(6) Optional & Amb. only
(g)(1)
(g)(2): TP 1.8, TD 2.0
(g)(3): TP 1.3
(g)(4): TP 1.2

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)

Page 9

3.2.7 2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested: Ambulatory / Inpatient / No CQMs tested

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs (CMS ID, with version where noted):

2, 22, 50, 52, 56, 61, 62, 64, 65, 66, 68 v3, 69 v2, 74, 75, 77, 82, 90, 117, 122, 123, 124, 125, 126, 127 v2, 128, 129 v3, 130 v2, 131, 132, 133, 134 v2, 135, 136, 137, 138 v2, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 153, 154, 155, 156 v2, 157, 158, 159, 160, 161, 163, 164 v2, 165 v2, 166 v3, 167, 169, 177, 179, 182

Inpatient CQMs (CMS ID):

9, 26, 30, 31, 32, 53, 55, 60, 71, 72, 73, 91, 100, 102, 104, 105, 107, 108, 109, 110, 111, 113, 114, 171, 172, 178, 185, 188, 190

Page 10

3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording

Automated Numerator Recording Successfully Tested: (a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)

Automated Numerator Recording was not tested

3.2.8.2 Automated Measure Calculation

Automated Measure Calculation Successfully Tested: (a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)

Automated Measure Calculation was not tested
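Whether recorded automatically or calculated, the (g)(1)/(g)(2) capabilities above reduce to numerator/denominator arithmetic over events the EHR logs on its own. A minimal sketch of that arithmetic follows; the class, measure name, and counts are hypothetical, not data from this report.

# Illustrative only: the percentage arithmetic behind automated measure
# calculation (§170.314(g)(2)). Names and counts are made up.
from dataclasses import dataclass

@dataclass
class Measure:
    name: str
    numerator: int    # actions the EHR recorded electronically
    denominator: int  # all qualifying actions in the reporting period

    def percentage(self) -> float:
        """Attestation percentage, guarding against an empty denominator."""
        if self.denominator == 0:
            return 0.0
        return 100.0 * self.numerator / self.denominator

if __name__ == "__main__":
    cpoe = Measure("CPOE for medication orders", numerator=412, denominator=505)
    print(f"{cpoe.name}: {cpoe.percentage():.1f}%")  # prints 81.6%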

3.2.9 Attestation

Attestation Forms (as applicable), with the appendix containing each form:

Safety-Enhanced Design*: Appendix B
Quality Management System**: Appendix C
Privacy and Security

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product

Page 11

3.3 Appendices

Appendix A: Test Data Alterations

The following deviations from the ONC-approved Test Data were utilized during certification testing:

• Used updated medication allergy reactions and updated RxNorm codes for meds and med allergies
• Used updated SNOMED-CT codes for problems and social history
• Substituted CVX 141 for CVX 88 for immunizations
• Used modified cognitive and functional statuses
• System does not allow drug formulary checking to be turned off
• System does not ePrescribe controlled substances
• System does not allow an eNote to be created without a visit

Appendix B: Safety-Enhanced Design Attestation

The following Safety-Enhanced Design attestation was submitted during certification testing:

1 170.314(g)(3) Safety-enhanced design

Identify if the EHR technology is scheduled to be tested for certification against this criterion. If not, proceed to the next section.

1.1 Identify which of the following criteria are scheduled to be tested or inherited for certification.

1.1.1 170.314(a)(1) Computerized provider order entry

1.1.2 170.314(a)(2) Drug-drug, drug-allergy interactions checks

1.1.3 170.314(a)(6) Medication list

1.1.4 170.314(a)(7) Medication allergy list

1.1.5 170.314(a)(8) Clinical decision support

1.1.6 170.314(a)(16) Electronic medication administration record (inpatient setting only)

1.1.7 170.314(b)(3) Electronic prescribing

1.1.8 170.314(b)(4) Clinical information reconciliation

1.2 Document the applied user-centered design (UCD) processes for each applicable EHR technology capability submitted for testing. Provide the name, description, and citation for all UCD processes used.

• If a single UCD process was used for applicable capabilities, it would only need to be identified once.

• If different UCD processes were applied to specific capabilities, be sure to indicate the criterion or criteria to which each UCD process applies.

• If a modified UCD process was used for any of the applicable capabilities, an outline and short description of the UCD process must be provided. The description must also include identifying any industry-standard UCD process upon which the modified UCD process was based.

Page 12

Modified UCD process applied to applicable capabilities. See the documents appended to this test results summary.

1.3 Submit a Usability Test Report for each criterion you selected in Question 1.1.

• Attach the Usability Test Report in a separate document.
• Identify the name of the report(s) and any other supporting documentation materials in the field below. If more than one report is submitted, specify which report applies to which criteria.
• Reports may be supplied in any format, though they must include the necessary information for all of the certification criteria submitted for testing and conform to the content and completion requirements of the Customized Common Industry Format Template for Electronic Health Record Usability Testing per NISTIR 7742. Failure to include all required elements will constitute automatic failure of the SED Attestation.

• The official NISTIR 7742 report template can be located at http://www.nist.gov/itl/hit/upload/LowryNISTIR-7742Customized_CIF_Template_for_EHR_Usability_Testing_Publicationl_Version-doc.pdf

See appended safety enhanced design document for usability test results.

Appendix C: Quality Management System Attestation

The following Quality Management System attestation was submitted during certification testing:

1 170.314(g)(4) Quality management system

1.1 If an industry standard QMS was used during the development, testing, implementation or maintenance of the EHR technology for any of the certification criteria, specify it/them by name (e.g. ISO 9001, IEC 62304, ISO 13485, 21 CFR Part 820, etc.). If an industry standard QMS was not used, please skip to Question 1.2.

N/A

1.2 If a modified or "home-grown" QMS was used during the development, testing, implementation or maintenance of the EHR technology for any of the certification criteria, include an outline and short description of the QMS, which could include identifying any industry-standard QMS upon which it was based and modifications to that standard. If a modified or “home-grown” QMS was not used, please skip to Question 1.3.

Page 13

The objectives being presented for certification were developed following a “home-grown” Quality Management System that incorporates many aspects of the Agile with Scrum framework for quality design, development, and testing. An outline is below:

1. Features to be delivered are identified from regulatory need and discussions with internal stakeholders (Support, Implementation, Sales, and Clinical Consultants) and with various customers, including a Clinical Advisory Council.

2. The Product Manager finalizes the list of prioritized features to be included in a particular release, with consideration of regulatory requirements, value to the customers, and value to the business.

3. Each Agile team is comprised of a Design Analyst ("Product Owner"), Quality Assurance Analyst, and two or three Developers. When a piece of work is scheduled for a release and assigned to a functional Design Analyst and/or Technical Architect, the Analysts carry out their due diligence to understand the business problems, requirements, and necessary workflow, and create user stories and lean documentation that will be used to communicate with the rest of the Agile team and stakeholders. The goal is to have at least two sprints' (one month's) worth of details already groomed with the team and ready for development.

4. The Agile team and Designer meet with stakeholders to understand the problems to be solved and identify the tasks that will need to be carried out to develop, perform unit testing, and carry out functional and automated testing of each user story at hand. The Agile team may refine the user stories in order to best prioritize and plan the work to be done.

5. The Agile team builds the functionality, with ongoing input from the functional Design Analyst and Architect. Unit and manual testing are performed in various environments. Automated testing is being established in the organization. For larger, complex designs, the Design Analyst walks through manual user-acceptance testing along with the team. Communication among multiple teams is ongoing to help ensure that dependencies and integration issues are identified and addressed as early as feasible. Bi-weekly demos are presented and recorded for internal stakeholders.

6. Regression and user-acceptance testing occur during the last one to two months of the project, during which daily "scrums" occur to triage issues found and remove impediments. During this period, as well, Beta customers are identified and Beta rollout is planned.

7. Beta customers are selected based on various criteria: size, server setup, hosted versus not hosted, and other technical and implementation concerns, as well as consideration of whether the customer is likely to be able to dedicate time to thorough testing and is willing to give honest, constructive feedback. Defects are addressed and high-importance enhancements are completed during Beta, while less-important enhancements may be deferred until a later release.

8. After the Beta process and signoff from Beta customers, training manuals and methods are generated. The final product is made Generally Available and rolled out to customers. A controlled version management process enables the organization to manage releases into production. Customer support is arranged in a tier structure to handle feedback from the user base. Issues are handled or elevated to a higher support tier until resolved.

1.3 If no QMS was used during the development, testing, implementation or maintenance of the EHR technology for any of the certification criteria, please state that.

N/A

Page 14

Test Results Summary Document History

Version 1.0: Original, July 31, 2014

END OF DOCUMENT

Page 15

meridianEMR 2014 Safety-Enhanced Design Report for a1, a8, b7

User-Centered Design Process for meridianEMR Version 7.0

This process applies to the following objectives included in this application:

170.314(a)(1): CPOE

170.314(a)(8): Clinical Decision Support

170.314(b)(7): Clinical Information Reconciliation

Below is provided a description of the meridianEMR user-centered design (UCD) process used to create the above-referenced functionality presented for certification, as well as the rest of the 2014 Meaningful Use-related changes in meridianEMR.

The meridianEMR Product Management team is staffed by a Director of Product Management and several Business Analysts that serve as Product Owners for different areas of the software. The lead Product Owners are trained in human factors engineering, usability, and user-centered design, including experience using primary and secondary personas and usability testing (both formal and informal) to better understand and address the users’ needs.

meridianEMR’s UCD is a home-grown process, not based on any one specific UCD methodology. The goal of this document is to:

• Summarize this UCD process and how users are involved in the determination of software content and functionality; and
• Illustrate how we have begun incorporating usability testing into the process.

The meridianEMR Product Management team plans the content of releases by evaluating various user inputs, including requests by customers for enhancements and usability improvements. When the Product Manager assigns a feature to a Product Owner, the Product Owner discusses the request with customers and other stakeholders to understand the business problem. When possible, the Design Analyst visits customer sites to discuss the business need with physicians, clinical staff and back-office personnel, as well as observing the users’ surroundings for clues into what the users need to have easily at hand. For example, paper or sticky notes posted on and around a user’s computer monitor, and files that are kept on a desk, give valuable clues into the types of information that need to be immediately accessible in the software. These strategies, and others, help the Design Analyst gain a greater understanding of the user needs, workflows, and work environments, in order to identify and prioritize the actors and use cases that must be addressed by this feature, as well as those that may be candidates for usability testing.

Page 16

At times when an on-site visit is not feasible, the Design Analyst will enhance this research by discussing the business problems and potential solutions on phone calls with customers.

During and after this research, the Design Analyst mocks up a rough user interface concept, either in wireframe or using MS Paint or Adobe Photoshop, and creates documentation that may also include user workflow descriptions and acceptance criteria. The user interface concepts are based on human behavior principles and existing design patterns in meridianEMR.

The meridianEMR UCD process is evolving to incorporate greater, ongoing user involvement and formal usability testing. For the version 5.8, 5.9, and 7.0 functionality, Product Management established a cadence of user group presentations that presented design concepts and invited customers to give early feedback.

For the objectives described in 170.314(g)(3), usability testing was performed with users that closely represent the user population for those features. As described in our accompanying Usability Test Report for meridianEMR version 7.0 and previously submitted test report for version 5.8, the users that participated in usability testing received minimal introduction to the goals of the test and to any new functionality. They were asked to figure out how to carry out a set of defined tasks without assistance. This type of testing helps the design team understand what users find familiar, easy to learn, and comfortable to use.

The meridianEMR UCD process will continue to evolve past this release, as the team continues to increase user involvement in planning and designing workflow to improve patient safety and efficiency. The Product Management team plans to establish usability testing as an integral, ongoing part of our development and design process.

In summary, the meridianEMR UCD process is a home-grown process that has always involved some measure of user input and consideration of users’ needs and enhancement requests. With our 5.8, 5.9, and 7.0 releases, we have begun incorporating usability testing into the process and plan to continue and refine our usability testing as our UCD process continues to evolve.

Page 17

EHR Usability Test Report of meridianEMR Version 5.8

Report applies to:

• 170.314(a)(1): Computerized Provider Order Entry
• 170.314(a)(2): Drug-drug, drug-allergy interaction checks
• 170.314(a)(6): Medication List
• 170.314(a)(7): Medication Allergy List
• 170.314(b)(3): Electronic Prescribing

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

meridianEMR version 5.8

Date of Usability Test: January 15 & January 22, 2014
Date of Report: April 16, 2014
Report Prepared By: meridianEMR Product Management
Mary Thompson, Senior Business Analyst
[email protected]
1701 S. Enterprise, Suite 104
Springfield, MO 65804

Contents

EXECUTIVE SUMMARY ... 3
Major findings ... 4
Areas for Improvement ... 5
Introduction ... 6
Method ... 7
Participants ... 7
Study Design ... 7
Tasks ... 7
Procedures ... 8
Test Location and Environment ... 9
Test Forms and Tools ... 9

Page 18

Participant Instructions ... 9
Usability Metrics ... 9
Data Scoring ... 10
Results ... 11
Data Analysis and Reporting ... 11
Discussion of the Findings ... 12
EFFECTIVENESS ... 12
EFFICIENCY ... 12
SATISFACTION ... 12
MAJOR FINDINGS ... 13
AREAS FOR IMPROVEMENT ... 13
Appendices ... 14
Appendix 1: Instructions and Introduction Provided to Customers Before or At the Start of Usability Testing ... 14
Appendix 2: Feedback Requested from Customers ... 15
Appendix 3: meridianEMR User-Centered Design Process ... 15
Appendix 4: Additional Plans for Usability Testing to Meet the 170.314(g)(3) Certification Requirements ... 18

Page 19

EXECUTIVE SUMMARY

A usability test of MeridianEMR version 5.8, a urology EHR, was conducted on January 15 and 22, 2014 remotely with representative clinical users of MeridianEMR. The purpose of this test was to review and validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, 2 healthcare providers matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative tasks.

This study collected performance data on the following tasks typically conducted using an EHR:

• Document that the patient has no known drug allergies (NKDA)
• Document medication allergies in the patient record
• Add medication history to a patient’s record, including one free-text (not codified) medication
• Prescribe new medications, including an electronic prescription
• View interactions triggered by the combination of allergies and medications
• Address interaction warnings
• Discontinue medications using two separate methods
• View patient’s medication history
• Carry out Computerized Provider Order Entry (CPOE) for medications, lab orders and radiology orders

During the 60-minute one-on-one usability test, each participant was greeted by the administrator. They were provided with information about the reasons and goals of the test ahead of time, and they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test, and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data on paper and electronically. The administrator did not give the participant assistance in how to complete the task.

Participant screens and audio were recorded for subsequent analysis.

The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance • Approximate time to complete the tasks • Number and types of errors • Path deviations

Page 20

• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system

All participant data was de-identified in this report – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire and were compensated with a free hour of MeridianEMR training for their time. Below is a summary of the performance and rating data collected on the EHRUT. It is collected from verbal feedback about the testing steps and MeridianEMR, as well as observations made by the MeridianEMR staff members that participated in the test.

Task: result for Participant 1 / result for Participant 2 (both participants were clinical team members)

• Document that the patient has No Known Drug Allergies (NKDA): Successful / Successful
• Document medication allergies in the patient record: Successful / Successful
• Add medication history to a patient’s record, including one free-text (not codified) medication: Successful / Failure (unable to remember how to free-text a medication)
• Prescribe new medications and submit electronically: Successful / Successful
• View interactions triggered by the combination of allergies and medications: Successful / Successful
• Address interaction warnings: Successful / Successful
• Discontinue medications using two separate methods: Successful / Successful
• View patient’s medication history: Successful / Successful
• Enter CPOE for medications, lab orders and radiology orders: Successful / Failure (unable to remember how to indicate CPOE)

Major findings

• One user tested had a hard time following the instruction to enter a “free-text medication”, but this was outside the scope of her daily activities.
• Users did not display or verbalize any challenges with NKDA or medication allergies or reviewing interaction warnings.
• Users generally understood what CPOE was and knew how to enter orders, but they were very confused by the CMS definition when we shared it with them during the introduction to the test.

Page 21

Areas for Improvement

• None of the users documented a start date for allergies, and only one documented a reaction. This suggests some need for user education regarding the value of this information in the EMR and for data exchange. If customers agree that this information is important, we need to evaluate whether to include a way to enter this information when the user is selecting an allergen.

• Functionality related to how MeridianEMR calculates the numerator for CPOE, which is based on the CMS definition of CPOE, needs to be carefully documented by our Technical Writing team and communicated by our Services and Support teams, so that our users understand how the measure is calculated and how their workflow needs to be adjusted in order to accommodate this.

Page 22

Introduction

The EHRUT tested for this study was MeridianEMR, version 5.8. Designed to present medical information to physicians and their staffs, the EHRUT is an EHR used exclusively in ambulatory practices; it includes the typical functions available in such systems, including the ability to view, capture, and update medical history and office visits, with interfaces to outside Practice Management and Billing systems. The usability testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency and user satisfaction, such as ease and speed of allergy and medication entry, were captured during the usability testing.

Page 23

Method

Participants

A total of 2 participants were tested on the EHRUT(s). Each participant is a current end user of the EHRUT and fills a clinical role:

• Clinical office user who also acts as a “super user” of the software (female)
• Clinical office user who also fills a Medical Assistant role (female)

Participants were recruited by meridianEMR staff and were compensated with one hour of education time on meridianEMR. In addition, participants had no direct connection to the development of the EHRUT(s) or to the organization producing it. Participants were not from the testing or supplier organization.

Study Design

Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a reference for enhancements to the tested functionality, and a baseline for future tests with an updated version of meridianEMR.

During the current usability test for version 5.8, the system was evaluated for effectiveness, efficiency and satisfaction as expressed by each participant:

• Number of tasks successfully completed within the allotted time without assistance
• Approximate time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system

Additional information about the tasks performed and user comments can be found in the section on Usability Metrics later in this document.

Tasks

The following tasks were constructed to be realistic and representative of the kinds of activities a user might do with this EHR:

MeridianEMR 2014 Safety-Enhanced Design Report for a1, a2, a6, a7, b3

Page 24: Amit V. Trivedi

Page 8 of 18

• Document that the patient has no known drug allergies (NKDA)
• Document medication allergies in the patient record
• Add medication history to a patient’s record, including one free-text (not codified) medication
• Prescribe new medications, including an electronic prescription
• View interactions triggered by the combination of allergies and medications
• Address interaction warnings
• Discontinue medications using two separate methods
• View patient’s medication history
• Carry out Computerized Provider Order Entry (CPOE) for medications, lab orders and radiology orders

Tasks were selected based on their frequency of use, their criticality to patient safety, and whether they might be troublesome for users to carry out efficiently and without error. They were ranked and chosen with consideration of the study objectives.

Procedures

At the start of the session, participants were greeted and thanked for their time. Before the usability testing started, participants were given a handbook that explained the reason for the testing and that the system, and not the users’ abilities, was being tested (see Appendix 1).

To ensure that the test ran smoothly, two meridianEMR staff members participated in this test.

The administrator moderated the session, including providing instructions and tasks, and recorded the usability testing. A second person served as the data logger and took notes on task success, path deviations, type of errors, and comments.

A member of the HealthTronics Product Management team designed the test; this team member has professional training in user-centered interactive design and prior involvement with usability testing.

Participants were instructed to perform each task:

• As quickly as possible, making as few errors and deviations as possible
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use

For each task, the participants were given a written copy of the task. Following the session, the administrator posed questions to the participants and thanked each for their participation. Participants were informed ahead of time that they would be receiving a free hour of meridianEMR training.

Page 25

Test Location and Environment

Because of the small number of objectives being tested and the difficulty of securing users’ time to perform testing, the meridianEMR team decided to perform testing at a location and time convenient to the participants. Therefore, testing was performed during clinic hours at the users’ office. Users logged into a WebEx session with a meridianEMR staff member who was running the version of meridianEMR being tested on her computer. She gave the controls to the users, who were then able to walk through the tasks requested of them. The WebEx session was recorded for later reference.

Test Forms and Tools

During the usability test, the following documents and instruments were used:

• Documented instructions and purpose for testing (Appendix 1)
• A list of tasks to carry out, repeated in the Tasks section and in the Executive Summary
• WebEx, as described under Test Location and Environment above

The participants’ interaction with the EHRUT was captured and recorded digitally, using WebEx. The data logger observed the session.

Participant Instructions

Participants were provided instructions ahead of the usability test, and those instructions were reiterated verbally before the test began. Those instructions are documented in Appendix 1.

Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, feedback regarding effectiveness, efficiency and user satisfaction was observed and captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of recording and prescribing medications in meridianEMR
2. Efficiency of carrying out the same tasks
3. Satisfaction with the ease of learning and using meridianEMR and the system’s performance in carrying out these tasks

Page 26

Data Scoring

The following table details how tasks were scored, errors evaluated, and the time data analyzed.

Effectiveness: Task Success
A task was counted as “Successful” if the participant could achieve the correct outcome without assistance and with little hesitation. Task Success also indicates the task was carried out quickly; see Efficiency: Task Time below.

Effectiveness: Task Failure
A task was counted as a “Failure” if the participant did not know how to carry it out and could not figure out how without assistance.

Efficiency: Task Deviations
Deviations occur if the participant walked through the wrong path in an effort to carry out the assigned test.

Efficiency: Task Time
Task time was observed and recorded. Because the tasks were all carried out very quickly, however, task time was not quantitatively documented in minutes or means. An effectiveness measure of “Task Success” indicates the task was carried out quickly; if the task time was unsatisfactory (not efficient), the results note this.

Satisfaction: Verbal Feedback
Participants’ subjective impression of the ease of use of meridianEMR was captured in response to simple post-task questions and discussion. This satisfaction is documented in the MAJOR FINDINGS and AREAS FOR IMPROVEMENT sections later in this document.
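To make the scoring rules above concrete, the sketch below tallies task outcomes the way the table defines them: each task is marked Successful or Failure per participant, and an overall success rate is computed. The task names and results are illustrative stand-ins, not the actual test log.

# Illustrative only: tally usability-task outcomes per the scoring rules.
from collections import Counter

# Hypothetical log: task -> [participant 1 result, participant 2 result]
results = {
    "Document NKDA": ["Successful", "Successful"],
    "Add free-text medication": ["Successful", "Failure"],
    "Enter CPOE orders": ["Successful", "Failure"],
}

def success_rate(task_results: dict[str, list[str]]) -> float:
    """Percentage of attempts scored Successful across all tasks."""
    counts = Counter(r for outcomes in task_results.values() for r in outcomes)
    total = sum(counts.values())
    return 100.0 * counts["Successful"] / total if total else 0.0

if __name__ == "__main__":
    print(f"Overall task success: {success_rate(results):.0f}%")  # 4/6 -> 67%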

Page 27

Results

Data Analysis and Reporting

The results of the usability test are detailed below. The results should be seen in light of the objectives and goals outlined in Study Design. The data should yield actionable findings that, once addressed, will have a material, positive impact on user performance.

Task: result for Participant 1 / result for Participant 2 (both participants were clinical team members)

• Document that the patient has No Known Drug Allergies (NKDA): Successful / Successful
• Document medication allergies in the patient record: Successful / Successful
• Add medication history to a patient’s record, including one free-text (not codified) medication: Successful / Failure (unable to remember how to free-text a medication)
• Prescribe new medications and submit electronically: Successful / Successful
• View interactions triggered by the combination of allergies and medications: Successful / Successful
• Address interaction warnings: Successful / Successful
• Discontinue medications using two separate methods: Successful / Successful
• View patient’s medication history: Successful / Successful
• Enter CPOE for medications, lab orders and radiology orders: Successful / Failure (unable to remember how to indicate CPOE)

Page 28

Discussion of the Findings

EFFECTIVENESS

Participants were able to carry out tasks that were typically part of their daily workflow successfully and efficiently. Participants were slowed down and showed hesitance when asked to carry out a task that someone else in the office would typically carry out, such as a clinical user being asked to enter a free-text medication, when that is not in her normal scope of work.

These tasks caused some confusion and hesitation on the part of the users:

• Creating a free-text medication; and
• Understanding how meridianEMR will calculate the Core 1 measure to differentiate between electronic orders that do and do not fit the CMS definition of CPOE

The purpose of asking the customers to enter a free-text medication was to gauge their responses to not seeing that medication in an interaction check. The results did not come out as hoped, however, because it turned out that the customers do not enter free-text medications at all, precisely because they want to see interaction checks. This was a fault of the test design but a positive outcome nonetheless. It suggests that users find it easy to enter properly coded medications as the system is designed; they do not find a need to “short cut” the medication search by keying medications into the system manually.

EFFICIENCY

As indicated under Effectiveness above, users carried out familiar tasks more quickly than unfamiliar tasks, which was as expected.

Addressing the usability issues listed as bullet points under EFFECTIVENESS above should enable users to gain efficiency as well.

SATISFACTION

All of the users tested expressed satisfaction with meridianEMR in general and the tasks that were asked of them in particular, with two exceptions:

• We created a way for providers to see which XRay and Lab orders will count towards CPOE in the numerator, and we gave the opportunity for a provider to indicate if an order is not CPOE under the CMS requirements for the Core 1 measure under the 2014 Meaningful Use requirements. This new functionality was very confusing to the users during preparation for the test, and they suggested changing it, even though the functionality was designed to provide passive notification and not get in the users’ way.

Page 29

MAJOR FINDINGS

• Not surprisingly, tasks were easier for users who carry them out regularly. For example, one clinical staff member entered medication history more quickly than another, because of their daily clinical responsibilities.

• Users tested had a hard time following the instruction to enter a “free-text medication.”
• Users did not display or verbalize any challenges with NKDA or medication allergies or reviewing interaction warnings.
• Users generally understood what CPOE was and knew how to enter orders, but they were very confused by the CMS definition when we shared it with them during the introduction to the test.

AREAS FOR IMPROVEMENT

• None of the users documented a start date for allergies, and only one documented a reaction. This suggests some need for user education regarding the value of this information in the EMR and for data exchange. If customers agree that this information is important, we need to evaluate whether to include a way to enter this information when the user is selecting an allergen (currently, a user has to click twice more to add the information after the allergen has been captured).
• Functionality related to how meridianEMR calculates the numerator for CPOE, which is based on the CMS definition of CPOE, needs to be carefully documented by our Technical Writing team and communicated by our Services and Support teams, so that our users understand how the measure is calculated and how their workflow needs to be adjusted in order to accommodate this (for example, getting out of the habit of creating orders on paper and entering them into the system after the fact).

Page 30

Appendices

Appendix 1: Instructions and Introduction Provided to Customers Before or At the Start of Usability Testing

Please walk through the tasks below in the provided test environment, following your typical workflow in day-to-day use of meridianEMR. The tasks listed in this document are updates to current workflows already existing in meridianEMR.

The session today should last no more than about an hour. During that time, you’ll be asked to carry out a series of familiar tasks in meridianEMR, and the test will be observed and recorded for later analysis by meridianEMR Product Management. (The recording and your identity will not be shared outside of HealthTronics.) We are testing the system, not your ability to carry out the tasks we’ll ask of you in this testing session.

Be aware that you are working in an in-progress version of meridianEMR. The functionality you are about to test, however, is considered “done” based on internal coding/testing.

Please feel free to report concerns and issues, both verbally and using the questionnaire at the end of this document.

You may withdraw from usability testing at any time.

Thank you for your valuable time and feedback!

Page 31

Appendix 2: Feedback Requested from Customers

Below are the questions submitted to customers to comment upon after usability testing.

Please provide verbal or written feedback on how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. The questions below are guidelines – you may fill them out or give feedback verbally.

Although your responses will be shared with CCHIT as part of the meridianEMR MU2 certification, your identity will be anonymous.

• What are your overall impressions of meridianEMR from this testing and in general?
• What aspects of the system that you tested today do you like the most?
• What aspects of the system that you tested today do you like the least?
• Were there any features you were surprised to see?
• Were there any features you expected to see but did not?
• Did you find any of today’s steps unnecessarily complex?
• How much training do you believe would be necessary before a new user could carry out the tasks you carried out today?
• Do you have other feedback and comments you’d like to share today?

Appendix 3: meridianEMR User-Centered Design Process

Below is a description of the meridianEMR user-centered design process used to create the functionality being presented for certification and discussed in this report.

meridianEMR, founded in 2003, is an EHR with development efforts focused on urology. The user workflows for recording and accessing the functionality evaluated during certification of these objectives –

o 170.314(a)(1): Computerized Provider Order Entry o 170.314(a)(2): Drug-drug, drug-allergy interaction checks o 170.314(a)(6): Medication List o 170.314(a)(7): Medication Allergy List o 170.314(b)(3): Electronic Prescribing

as well as the rest of the 2014 Meaningful Use changes in meridianEMR – are based on functionality and workflow originally conceived of by a practicing urologist. Customers and clinical and regulatory advisors are also consulted in changes to the application. The meridianEMR Product Manager plans the content of releases by evaluating the following:

• Patient Safety-Related issues (typically fixed on HotFix Release)

Page 32

• Regulatory requirements
• Strategic Features (typically “Roadmap” features that improve physician and clinical staff efficiency)
• Technical Release Needs
• Feature requests from customers, which are reviewed to determine:
  o Requests which are most frequently received across the customer base
  o Requests which can be scoped to fit release needs
  o Requests which can have a big impact on users (even if not widely requested)
• Internal “polling” occurs to identify top requests by Implementation and Sales.
• If needed, evaluate whether any special client handling of requests is needed.
• Defects are reviewed with Support to verify that the highest-priority defects are marked as Critical/High priority for the release.

Each feature is then designed as described by the process summarized below.

The meridianEMR Product Management team includes Design Analysts trained in human factors engineering, usability, and user-centered design, including experience using primary and secondary personas to better understand and communicate the users’ needs. The team is staffed by a Director of Product Management and several Business Analysts who serve as Product Owners for different areas of the software. When the Product Manager assigns a feature to a Product Owner, the Product Owner discusses the request with customers and other stakeholders to understand the business problem. When possible, the Design Analyst visits customer sites to discuss the business need with physicians, clinical staff, and back-office personnel, and to observe the users’ surroundings for clues into what the users need to have easily at hand. For example, paper or sticky notes posted on and around a user’s computer monitor, and files that are kept on a desk, give valuable clues into the types of information that need to be immediately accessible in the software. Additionally, any computer tasks performed outside of the EMR (such as in Excel spreadsheets) or through repeated manual tasks suggest areas where workflow in the software can be improved and efficiencies gained for users. In short, these visits help the Design Analyst gain a greater understanding of the user needs, workflows, and work environments, in order to identify and prioritize the actors and use cases that must be addressed by the feature.

The Design Analyst also discusses the feature and findings with internal stakeholders, such as educators who may have additional insight from their own customer visits, and technical staff who can advise about approaches that will be most efficient with the least impact on performance. When an on-site visit is not feasible, the Design Analyst supplements this research by discussing the business problems and potential solutions on phone calls with customers.


During and after this research, the Design Analyst mocks up a rough user interface concept, either as a wireframe or using MS Paint or Adobe Photoshop, and creates documentation that may also include user workflow descriptions and acceptance criteria. The purpose of this documentation is to communicate the user acceptance criteria, visual concepts, and other requirements to the Development team and other stakeholders. The user interface concepts are based on human behavior principles and existing design patterns in meridianEMR. Because it is important for the user interface to be intuitive and familiar, the user’s workflow on the new or updated screen is carefully analyzed to help make it easy to learn, use, and remember. Where a new feature is being introduced to replace a paper process, the user interface is designed to match the users’ mental model for that process as closely as possible. The design team’s goals include helping users be successful while avoiding intrusive, unnecessary pop-ups or extra clicks. When a certain area of the product is being updated, effort is made to incorporate other enhancements that users have requested, when such enhancements will not significantly grow scope and can be addressed properly in the timeline of the project.

meridianEMR is developed following an Agile with Scrum methodology, with each Scrum team including Developers, Quality Assurance Analysts, a Design Analyst, and a technical writer to create the raw documentation that will be used for customer training, release summaries, and internal technical reference. The Design Analyst has daily involvement with the Development team, which reviews the Design Analyst’s concept and acceptance criteria and negotiates changes if the original concept is not technically feasible. At the end of each sprint, the stories developed and tested during the sprint are reviewed by the Design Analyst and signed off if they meet the acceptance criteria; and a demo is presented to Product Management to confirm that the functionality meets the business problems. Changes requested during the demo will be prioritized and either completed during the next sprint or deferred until a later release.

Approaching Beta testing, the Quality Assurance and Development teams are “all hands on deck” to re-test all stories included in the release to verify nothing has broken and to fix any defects found. During this regression-testing period, Product Management and other internal stakeholders also perform user acceptance testing to confirm accuracy and usability of the release functionality. When this regression period is over, the software is released to the first set of Beta customers. Initial customer use of new and updated features is observed during Beta testing, and any defects in code and design reported during Beta are carefully considered and either addressed immediately or scheduled for an upcoming release. Once releases become Generally Available and are installed at a greater number of customer sites, Support and Product Management receive and monitor feature requests received from customers. Requests related to patient safety are prioritized at the top of the Maintenance team’s list and addressed as quickly as possible, typically through hot fixes released between regular feature and maintenance releases.


Appendix 4: Additional Plans for Usability Testing to Meet the 170.314(g)(3) Certification Requirements

This is the first of two usability testing studies planned for meridianEMR to meet the certification requirements for 170.314(g)(3) under the 2014 EHR Certification final rule. Testing of the remaining objectives required for (g)(3) Safety-Enhanced Design (170.314(a)(8) Clinical Decision Support and 170.314(b)(4) Clinical Information Reconciliation) will be performed during the first half of 2014 in meridianEMR version 5.9. The results of that test will be documented separately and submitted during a later wave of certification for 2014 Meaningful Use.


meridianEMR 2014 Safety-Enhanced Design Report for a1, a8, b4

EHR Usability Test Report of meridianEMR Version 7.0

Report applies to:

170.314(a)(1): Computerized Provider Order Entry

170.314(a)(8): Clinical Decision Support

170.314(b)(4): Clinical Information Reconciliation

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

meridianEMR version 7.0

Date of Usability Test: June 18, 2014
Date of Report: June 19, 2014
Report Prepared By: meridianEMR Product Management
Joanne Finn, Senior Technical Writer
[email protected]
1701 S. Enterprise, Suite 104
Springfield, MO 65804

Contents

EXECUTIVE SUMMARY
  Major findings
  Areas for Improvement
Introduction
Method
  Participants
  Study Design
  Tasks
  Procedures
  Test Location and Environment
  Test Forms and Tools
  Participant Instructions
  Usability Metrics
Data Scoring
Results
  Data Analysis and Reporting
  Discussion of the Findings
    EFFECTIVENESS
    EFFICIENCY
    SATISFACTION
  Major findings
  Areas for Improvement
Appendices
  Appendix 1: Instructions Provided to Customers at the Start of Usability Testing
    Computerized Provider Order Entry (CPOE)
    Clinical Decision Support
    Clinical Information Reconciliation
  Appendix 2: Additional Feedback about Other System Behavior


EXECUTIVE SUMMARY

A usability test of meridianEMR version 7.0, a urology EHR, was conducted remotely on June 18, 2014, with a clinical user of meridianEMR who would typically use the functionality being tested. The purpose of this test was to review and validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, one clinical user matching the target demographic criteria for CPOE, Clinical Decision Support, and Clinical Information Reconciliation served as the participant and used the EHRUT in simulated, but representative, tasks.

This study collected performance data on the following tasks typically conducted using an EHR:

• Carry out Computerized Provider Order Entry (CPOE) for medications (prescriptions), laboratory orders, and radiology orders
• Enable one or more Clinical Decision Support interventions in the Administration area, and add documentation to the patient record that would prompt Clinical Decision Support interventions
• Using the Health Maintenance (HME) process, observe the Clinical Decision Support alerts, recommendations, and system-generated tasks related to each recommendation
• Incorporate a CCDA document using role-based, security-restricted functionality and attach it to a patient’s chart
• Perform Clinical Information Reconciliation to reconcile the data in the CCDA with existing data in the system for the selected patient

Note: Due to feedback we received about CPOE, we removed CPOE from the first testing wave and revised some UI behavior to make the process clearer to the end user. We found in this round of testing that it was much easier for the user to know when CPOE applies to the orders being entered into the system and when it does not.

During the 45-minute one-on-one usability test, the participant was greeted by the administrator, was provided with information about the reasons and goals of the test ahead of time, and was instructed that they could withdraw at any time. The participant had prior experience with the EHR. The administrator introduced the test and instructed the participant to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data on paper and electronically. The administrator did not give the participant assistance in how to complete the tasks. Participant screens and audio were recorded for subsequent analysis.

The following types of data were collected for each participant:


• Number of tasks successfully completed within the allotted time without assistance
• Approximate time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system

All participant data was de-identified in this report – no correspondence can be made from the identity of the participant to the data collected. Following the conclusion of the testing, the participant was asked to provide additional verbal feedback about these and other areas of the system. Below is a summary of the performance and rating data collected on the EHRUT. It was collected from verbal feedback about the testing steps and meridianEMR, as well as observations made by the meridianEMR staff members who participated in the test.

Task: Carry out Computerized Provider Order Entry (CPOE) for medications (prescriptions), laboratory orders, and radiology orders
Registered Nurse: Successful (time spent: 5.5 minutes)
Average time: 5.5 minutes. Results were as expected for this user sample.

Task: Enable one or more Clinical Decision Support interventions in the Administration area, and add documentation to the patient record to observe Clinical Decision Support interventions
Registered Nurse: Successful (time spent: 2 minutes)
Average time: 2 minutes. Results were as expected for this user sample.

Task: Using the Health Maintenance (HME) process, observe the Clinical Decision Support alerts, recommendations, and system-generated tasks related to each recommendation
Registered Nurse: Deviation (time spent: 7.5 minutes). Clinical Decision Support notifications were not generated, and the participant was unable to see the CDS recommendations.
Average time: 7.5 minutes. Results were not as expected for this user sample.

Task: Incorporate a CCDA document using role-based, security-restricted functionality and attach it to a patient’s chart
Registered Nurse: Successful (time spent: 5 minutes)
Average time: 5 minutes. Results were as expected for this user sample.

Task: Perform Clinical Information Reconciliation to reconcile the data in the CCDA with existing data in the system for the selected patient
Registered Nurse: Successful (time spent: 7.25 minutes)
Average time: 7.25 minutes. Results were as expected for this user sample.


Major findings

• Health Maintenance items did not display in the patient chart after processing them and refreshing the window.
• After adding a new patient and processing Health Maintenance items, these items did not display in the patient chart.
• The participant really liked the Clinical Information Reconciliation process.

Areas for Improvement

• During the Clinical Information Reconciliation process, the source of the information should be more obvious. Additionally, documentation should describe the different CCDAs supported by meridianEMR as well as the use cases supported by those documents.
• Short-term – provide documentation on the functionality to trigger HMEs.
• Long-term – improve the triggers when there is an absence of information.


Introduction

The EHRUT tested for this study was meridianEMR, version 7.0. Designed to present medical information to urologists and their staffs, the EHRUT is an EHR used exclusively in ambulatory urology practices and includes the typical functions available in such systems, such as the ability to view, capture, and update medical history and office visits, with interfaces to outside Practice Management and Billing systems. The usability testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface, and to provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as ease of using Clinical Information Reconciliation and Clinical Decision Support interventions (i.e., Health Maintenance items), were captured during the usability testing.


Method

Participants

One participant was tested on the EHRUT. This participant is a clinical assistant and meridianEMR trainer in their practice. The participant was recruited by meridianEMR staff and had no direct connection to the development of, or the organization producing, the EHRUT. The participant was not from the testing or supplier organization.

The participant was scheduled for one 45-minute session. The following summarizes the information about the participant:

Gender: Female
Age: 50+
Occupation/Role: Registered Nurse / Clinical Director
Professional Experience: 10+ years
Computer Experience: 5+ years
Product Experience: 4+ years with meridianEMR

Study Design

Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participant. The data from this test may serve as a reference for enhancements to the tested functionality, and as a baseline for future tests with an updated version of meridianEMR.

During the current usability test for version 7.0, the system was evaluated for effectiveness, efficiency, and satisfaction as expressed by the participant:

• Number of tasks successfully completed within the allotted time without assistance
• Approximate time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system

Additional information about the tasks performed and user comments can be found in the section on Usability Metrics later in this document.

Tasks

The following tasks were constructed to be realistic and representative of the kinds of activities a user might perform with this EHR:


• Carry out Computerized Provider Order Entry (CPOE) for medications (prescriptions), laboratory orders, and radiology orders
• Enable one or more Clinical Decision Support interventions in the Administration area, and add documentation to the patient record to observe Clinical Decision Support interventions
• Incorporate a CCDA document using role-based, security-restricted functionality and attach it to a patient’s chart
• Perform Clinical Information Reconciliation to reconcile the data in the CCDA with existing data in the system for the selected patient

Tasks were selected based on their frequency of use, their criticality to patient safety, and how troublesome they might be for users to carry out efficiently and without error. They were ranked and ordered from least to most difficult in this test, which is not the typical way of designing such a test. In this case, however, the second and third groups of steps required some education and introduction, and the testers wanted to complete the easier, more familiar testing of CPOE first, so that the customers would not be discouraged by starting with the more challenging tasks.

Procedures

At the start of the session, the participant was greeted and thanked for their time. Before the usability testing started, the participant was given a handbook that explained the reason for the testing and that the system, not the user’s abilities, was being tested (see Appendix 1).

To ensure that the test ran smoothly, two MeridianEMR staff members participated in this test.

The administrator moderated the session, including providing instructions and tasks, and recorded the usability testing. The administrator, a member of the meridianEMR Product Management team, designed the test, has previous professional training in user-centered interactive design, and has had some involvement with usability testing in the past. A second person served as the data logger and took notes on task success, path deviations, types of errors, and comments.

The participant was instructed to perform each task:

• As quickly as possible, making as few errors and deviations as possible
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use

For each task, the participant was given a written copy of the task. Following the session, the administrator posed questions to the participant and thanked them for their participation.


Test Location and Environment

Because of the small number of objectives being tested, and the difficulty of securing users’ time to perform testing, the meridianEMR team decided to perform testing at a location and time convenient to the participant. Testing was therefore performed during clinic hours at the user’s office. The user logged into a WebEx session with a meridianEMR staff member who was running the version of meridianEMR being tested on her computer. She gave the controls to the user, who was then able to walk through the tasks requested of them. The WebEx session was recorded for later reference.

Test Forms and Tools

During the usability test, the following documents and instruments were used:

• Documented instructions and purpose for testing (Appendix 1)
• A list of tasks to carry out, repeated in the Tasks section and in the Executive Summary
• WebEx, as described under Test Location and Environment above

The participant’s interaction with the EHRUT was captured and recorded digitally using WebEx. The data logger observed the session.

Participant Instructions

The participant was provided instructions ahead of the usability test, and those instructions were reiterated verbally before the test began. Those instructions are documented in Appendix 1.

Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, feedback regarding effectiveness, efficiency, and user satisfaction was observed and captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of recording and prescribing medications in meridianEMR
2. Efficiency of carrying out the same tasks
3. Satisfaction with the ease of learning and using meridianEMR and the system’s performance in carrying out these tasks

Data Scoring

The following table details how tasks were scored, how errors were evaluated, and how the time data were analyzed.


Measures – Rationale and Scoring

Effectiveness: Task Success – A task was counted as “Successful” if the participant could achieve the correct outcome without assistance and with little hesitation. Task Success also indicates the task was carried out quickly – see Efficiency: Task Time below.

Effectiveness: Task Failure – A task was counted as a “Failure” if the participant did not know how to carry it out and could not figure it out without asking for assistance.

Efficiency: Task Deviations – Deviations occur if the participant walked through the wrong path or struggled in an effort to carry out the assigned test.

Efficiency: Task Time – Task time was observed and recorded. Because the tasks were all carried out very quickly, however, task time was not quantitatively documented in minutes or means. An effectiveness measure of “Task Success” indicates the task was carried out quickly, with little hesitation.

Satisfaction: Verbal Feedback – The participant’s subjective impression of the ease of use of meridianEMR was captured in response to simple post-task questions and discussion. This satisfaction is documented in the Major Findings and Areas for Improvement sections later in this document.
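These scoring rules can be read as a simple decision procedure. As an illustration only, the sketch below models one observed task under those rules; the TaskObservation type and its field names are hypothetical and are not part of meridianEMR or the test tooling.

```python
from dataclasses import dataclass

# Hypothetical record of one observed task, mirroring the scoring
# rules in the Data Scoring table above; illustrative only.
@dataclass
class TaskObservation:
    task: str
    completed_unassisted: bool  # achieved the correct outcome without assistance
    deviated: bool              # wrong path, or visible struggle during the task
    minutes: float              # observed task time

    def outcome(self) -> str:
        """Score the task per the rules above."""
        if not self.completed_unassisted:
            return "Failure"     # needed assistance or could not finish
        if self.deviated:
            return "Deviation"   # completed, but via a wrong path or struggle
        return "Successful"      # correct outcome, unassisted, little hesitation

# Example: the HME/CDS task from the Results table
hme = TaskObservation(
    task="Observe CDS alerts via Health Maintenance (HME)",
    completed_unassisted=True, deviated=True, minutes=7.5,
)
print(hme.outcome())  # -> "Deviation"
```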


Results

Data Analysis and Reporting

The results of the usability test are detailed below and should be read in light of the objectives and goals outlined in Study Design. The data should yield actionable results that, if addressed, will have a material, positive impact on user performance.

Task: Carry out Computerized Provider Order Entry (CPOE) for medications (prescriptions), laboratory orders, and radiology orders
Registered Nurse: Successful (time spent: 5.5 minutes). Results were as expected for this user sample.

Task: Enable one or more Clinical Decision Support interventions in the Administration area, and add documentation to the patient record to observe Clinical Decision Support interventions
Registered Nurse: Successful (time spent: 2 minutes). Results were as expected for this user sample.

Task: Using the Health Maintenance (HME) process, observe the Clinical Decision Support alerts, recommendations, and system-generated tasks related to each recommendation
Registered Nurse: Deviation (time spent: 7.5 minutes). Clinical Decision Support notifications were not generated, and the participant was unable to see the CDS recommendations. Results were not as expected for this user sample.

Task: Incorporate a CCDA document using role-based, security-restricted functionality and attach it to a patient’s chart
Registered Nurse: Successful (time spent: 5 minutes). Results were as expected for this user sample.

Task: Perform Clinical Information Reconciliation to reconcile the data in the CCDA with existing data in the system for the selected patient
Registered Nurse: Successful (time spent: 7.25 minutes). Results were as expected for this user sample.


Discussion of the Findings

EFFECTIVENESS

The following tasks caused some confusion and hesitation on the part of the user:

• While the user did not have much experience using HME, she was able to test the Clinical Decision Support area through the Health Maintenance (HME) process with minimal guidance from the administrator.
• Clinical Decision Support notifications were not generated, and the participant was unable to see the CDS recommendations. Currently, the notifications for the patient’s problem list require the user to enter at least one problem in order for the CDS Problem List rule to be triggered when the filter is set up for the absence of a diagnosis. Since the absence of all diagnoses for a patient is not a likely scenario, we expect this to be encountered only in a testing situation. However, we will consider an improvement for this feature in a later release. In the meantime, we will provide very clear user documentation for this functionality.
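To make the trigger behavior concrete, here is a minimal, hypothetical sketch of an absence-of-diagnosis rule of the kind described above; the function and data shapes are assumptions for illustration, not meridianEMR’s actual implementation. It shows why such a rule never fires for a chart whose problem list is empty when evaluation is keyed off entered problems:

```python
# Hypothetical model of an HME/CDS rule filtered on the ABSENCE of a
# diagnosis; illustrative only, not meridianEMR's implementation.

def cds_recommendations(problem_list, absence_rules):
    """Return recommendations for rules whose target diagnosis is absent.

    Mirrors the behavior observed in testing: evaluation runs over the
    entered problems, so an empty problem list produces no notifications
    even though every diagnosis is then trivially "absent".
    """
    if not problem_list:      # empty chart: nothing triggers evaluation
        return []             # <- the behavior seen during the test
    diagnoses = {p.lower() for p in problem_list}
    return [rule["recommendation"]
            for rule in absence_rules
            if rule["absent_diagnosis"].lower() not in diagnoses]

rules = [{"absent_diagnosis": "hypertension",
          "recommendation": "Consider blood pressure screening"}]

print(cds_recommendations([], rules))       # [] -- no alert, as in the test
print(cds_recommendations(["BPH"], rules))  # ['Consider blood pressure screening']
```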

EFFICIENCY

Below is a summary of the user feedback about improvements that would increase efficiency:

• Health Maintenance items did not display in the patient chart after processing them and refreshing the window. After adding a new patient and processing Health Maintenance items, these items did not display in the patient chart.
• During the testing of Clinical Information Reconciliation, the user was unsure of the document type to import. Users need something that reminds them which document types are which so they can select the correct document to import.
• “Document viewer does not currently support this document type.” This error displayed when the user selected the document to import during the testing of Clinical Information Reconciliation. The administrator informed the participant that this is an issue with the environment used for the test.

Addressing the usability issues listed as bullet points under EFFECTIVENESS above should enable users to gain efficiency as well.

SATISFACTION

Some comments from the customers suggested opportunities to improve meridianEMR user satisfaction:

• Changing a patient’s gender and refreshing takes too long. Because the gender defaults to female when adding a patient, if the user forgets to change it to male and later needs to change it in the chart, it takes too long for the change to appear after refreshing the screen. In fact, it is sometimes necessary to close the patient chart and then reopen it to refresh the gender. Users would like the default to be male when creating a new patient. Additionally, if the gender does need to be changed, users would like the change to occur more quickly, without having to close and reopen the patient chart.

Major findings

• Health Maintenance items did not display in the patient chart after processing them and refreshing the window.
• After adding a new patient and processing Health Maintenance items, these items did not display in the patient chart.
• The participant really liked the Clinical Information Reconciliation process and thought it was “cool.”

Areas for Improvement

• During the Clinical Information Reconciliation process, the source of the information should be more obvious. Additionally, documentation should describe the different CCDAs supported by meridianEMR as well as the use cases supported by those documents.
• Short-term – provide documentation on the functionality to trigger HMEs.
• Long-term – improve the triggers when there is an absence of information.


Appendices

Appendix 1: Instructions Provided to Customers at the Start of Usability Testing

Today, we will test how easy or difficult it is to use certain areas of meridianEMR that impact patient safety. It is not a test of your ability to use the system.

The areas covered by this test are defined by the government as part of the Meaningful Use 2014 EHR certification requirements.

The test session will be recorded for analysis and reference, but the recording will not be shared outside of HealthTronics IT. The results will be written in a report without using your names. The report is submitted as part of the meridianEMR MU2 certification.

Although we are only covering a limited area of meridianEMR during this test, your feedback about other areas of the system is very welcome.

The test should take no more than an hour. You are welcome to stop testing at any time.

Thank you for your help!

At the start of the meeting, a HealthTronics team member will:

• Log into the meridianEMR test environment as user/password.
• Pass control to you.

Areas to be tested:

• Computerized Provider Order Entry (CPOE)
• Clinical Decision Support
• Clinical Information Reconciliation

Computerized Provider Order Entry (CPOE)

1. Open up test patient Charles CPOE.
2. Enter a “medication order” – a prescription – for this patient.
3. Enter a radiology order for this patient.
4. Enter a lab test order for this patient.

Note: We have added the ability to distinguish between radiology and labs in Plan Item Maintenance.


Clinical Decision Support

Clinical Decision Support (CDS) for meridianEMR is conducted through our Health Maintenance (HME) process. We have added the ability to indicate which HME rules apply to CDS.

Administrative setup workflow:

1. Go to HME in Maintenance.
2. Set up a new HME rule and indicate that it is for CDS, or edit an existing rule.

User workflow:

1. The user workflow for HME has not changed. As you enter patient information, the CDS information is triggered for you automatically.
2. Based upon your user role, you will be able to see the alert as you work in the patient chart.

Clinical Information Reconciliation

Clinical Information Reconciliation describes the process of receiving Medications, Allergies, and Problems from a referring provider and electronically adding them to the patient record. In order to provide this functionality, we enhanced the Continuity of Care section of the application to include a Clinical Information Reconciliation (CIR) option on the select menu.

The new workflow is as follows:

1. Open a patient chart.
2. Select the Continuity of Care window.
3. Import a Transition of Care (TOC) document.
4. Select the CIR option for the document from the selection menu list.
5. Conduct the reconciliation.
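As a rough sketch of what step 5 involves conceptually, the hypothetical Python below merges imported CCDA medication entries into an existing chart list; the data shapes and name-based matching are assumptions made for illustration, not the actual CIR implementation.

```python
# Hypothetical sketch of reconciling imported CCDA entries with the
# existing chart; matching on a normalized name is an assumption made
# for illustration only.

def reconcile(existing, imported):
    """Merge imported entries into the chart, flagging duplicates for review."""
    by_name = {m["name"].strip().lower(): m for m in existing}
    merged, needs_review = list(existing), []
    for item in imported:
        key = item["name"].strip().lower()
        if key in by_name:
            needs_review.append(item)  # clinician confirms/keeps one entry
        else:
            merged.append(item)        # new information from the referrer
    return merged, needs_review

chart = [{"name": "Tamsulosin", "dose": "0.4 mg"}]
ccda  = [{"name": "tamsulosin", "dose": "0.4 mg"},
         {"name": "Lisinopril", "dose": "10 mg"}]

merged, review = reconcile(chart, ccda)
print(len(merged), len(review))  # 2 1
```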

Appendix 2: Additional Feedback about Other System Behavior

The testing instructions in Appendix 1 included an invitation to provide other feedback that may improve the user experience in meridianEMR. Below is a summary of this feedback:

• When adding a patient, the gender defaults to female. Users would like the default to be male.
• When changing a patient’s gender and then refreshing, the gender does not change as quickly as users would like. It is sometimes necessary to close the patient chart and then reopen it to view the change. Users would like this to occur more quickly, without having to close and reopen the patient chart.
• During the Clinical Information Reconciliation testing, the user was confused about how to find out where the information came from. The administrator informed the participant that the Clinical Information Reconciliation window shows the source of the information during the reconciliation process.


• The administrator showed the participant the new Meaningful Use Dashboard, and the participant liked that there was more information on the screen and that it was easier to see their Meaningful Use status.