
VARIAN MEDICAL SYSTEMS

User Centered Design Process Attestation document for §170.314(g)(3) Safety-enhanced design


Varian Medical Systems User Centered Design Process

Varian's User-Centered Design (UCD) process has been integrated into the Varian Oncology Systems (VOS) Product Lifecycle (PLC) process. The UCD process is intended to achieve enhanced usability of Varian products, which in turn is intended to minimize use errors and use-associated risks. This leads to better results in delivering safer and more satisfying products to Varian's customers and users.

Varian's UCD process involves users directly throughout the entire product lifecycle. The process is inherently iterative, enabling product teams to:

1. Understand users' goals in their context of use;

2. Create and refine product designs across well-defined phases of work; and

3. Validate requirements and designs with users to ensure that their needs are met.

The Varian UCD process involves iterative user activities that inform the whole process. These activities are highly

collaborative, ideally including User Experience Designers (UXD), Product Managers (PdMs), Engineers (ENG), and end

users.

The User-Centered Design process is executed as part of the Design and Development process. The UCD activities are

divided into four functional phases:

• Plan - a preliminary phase

1. Research

2. Model

3. Design

4. Adapt

• Assess - a transitional phase

The functional phases of Varian UCD are applicable to all Varian development. They are integrated into the PLC phases

(see Appendix A) of Inception, Elaboration, and Construction. This integration ensures that the UCD process is part of

the PLC framework and that project teams follow usability engineering guidelines for medical devices (IEC 62366), the

FDA Human Factors Guidance, and related ISO standards (ISO 9241-210:2010).


Preliminary Phase: Plan

At this point in the development process, the first step is to look at the development project and identify the UCD effort level, i.e., the UCD activities and deliverables required for the project as defined by the Project Scaling Work Instruction.

Phase 1: Research

The Research phase relies on iterative user research activities to understand users, their needs, goals, tasks, and pain

points. The timely alignment of this phase with the PLC Inception phase allows PdMs to leverage the information

gathered from user research activities in drafting the project requirements document.

Phase 2: Model

The Model phase is a synthesis phase focused on transferring the understanding of users and their needs to UI design.

The goal of this modeling phase is to define the user interaction with the future system without designing it. The timely

alignment of this phase with the PLC Inception phase allows PdMs to leverage the deliverables of this phase in drafting the

project requirements document.

Phase 3: Design

In the Design phase, which coincides with the beginning of the PLC Elaboration phase, the main goal is to not get locked into a single design solution too early. Therefore, this phase is broken into two stages: a "conceptual" design stage and a "detailed" design stage. The first stage is characterized by low-fidelity designs and the second stage by mid-fidelity designs.

Phase 4: Adapt

In the Adapt phase, which coincides with the beginning of the PLC Construction phase, the focus shifts from designing the UI to ensuring that the UI design gets implemented according to the UI specification. As the system is built/coded, the UX designer may have to adapt the design to address any unforeseen limitations in the target technology, new requirements, or missing functionality in the initial design. Throughout this phase, UXD, ENG, and PdM collaborate closely to ensure that what is adapted still meets user needs.

Transitional Phase: Assess

After the product is built and just before release, a formal Summative Usability Test (e.g., Benchmark Usability Test) is

conducted with users to assess the success of the product. At Varian, success is measured with the Usability Key

Performance Indicator (KPI), which is a combined score of the following benchmark metrics:

• Effectiveness - How well the system enables the user to accomplish tasks and goals.

• Efficiency - How much effort and time it takes to accomplish tasks.

• Use-safety - How free an environment (including devices, software, facilities, people, etc.) is from danger,

risk, and injury.

• Satisfaction - The users’ perception and subjective reaction to accomplishing tasks.
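The document does not specify how the four benchmark metrics are weighted or normalized into the combined Usability KPI. As a purely illustrative sketch, the Python snippet below shows one way such a combined score could be computed, assuming each metric has already been normalized to a 0-1 range and using hypothetical equal weights; the function name and weighting are assumptions, not Varian's actual formula.

# Hypothetical sketch only: the actual Varian KPI weighting and
# normalization are not described in this document.
def usability_kpi(effectiveness, efficiency, use_safety, satisfaction,
                  weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine four benchmark metrics (each normalized to 0..1) into a 0..100 score."""
    metrics = (effectiveness, efficiency, use_safety, satisfaction)
    if not all(0.0 <= m <= 1.0 for m in metrics):
        raise ValueError("metrics must be normalized to the 0..1 range")
    weighted = sum(w * m for w, m in zip(weights, metrics))
    return 100.0 * weighted / sum(weights)

# Example: 73% task success, 0.80 relative efficiency, 0.90 use-safety,
# and a 3.3/5 mean satisfaction rating produce a single combined number.
print(usability_kpi(0.73, 0.80, 0.90, 3.3 / 5))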


Appendix A - PLC Overview

Inception phase

The goal of the Inception Phase is to achieve concurrence among all stakeholders on the objectives for the device and

project by defining intended use requirements which reflect customer needs, as well as initiating the various

project/device plans.

Elaboration Phase

The goal of the Elaboration phase is to establish detailed use cases and device requirements. All the design input requirements shall be reviewed to ensure they are understood, non-conflicting, correct, unambiguous, and complete with regard to functionality (including interfaces and performance), testability, reliability, serviceability, manufacturability, and trainability, as appropriate. Design input requirements are to address the intended use of the device.

Construction Phase

The goal of the Construction phase is to complete the whole development of the system based on the design input

deliverables, including design verification and validation activities.

PLC Phase | UCD Phase | UCD Deliverable
Inception | Plan | UCD Project Plan (that gets integrated into the overall D&D plan)
Inception | Research | Findings from user research activities summarized in a user research report (e.g., contextual inquiry report)
Inception | Model | User and Domain Analysis Document; Use case validation report
Inception | Model | List of use-related risks and corresponding design mitigators
Elaboration | Design (concept) | Design concept in wireframes/storyboards; Formative Usability Evaluation Report
Elaboration | Design (detail) | Formative Usability Evaluation Report
Elaboration | Design (detail) | UI Design Specification document (UI specification)
Construction | Adapt | UI specification (updated when applicable)
Construction | Adapt | Formative Usability Evaluation Report
Construction | Assess | Summative Usability Test (e.g., Benchmark Usability Test Report)


EHR Usability Test Report

ARIA Oncology Information System for Medical Oncology, version 11 MR5

Product name: ARIA Oncology Information System for Medical Oncology

Product version: version 11 MR5

Date of Usability Test: May 6-9, 2014

Date of Report: May 26, 2014

Report Prepared By: Ross Sutherland, UX Designer. Varian Medical Systems

Contact info: 383 Broadway Avenue, Winnipeg, Manitoba, Canada R3C 4M8; email: [email protected]


EHR Usability Test Report of Varian Medical Systems ARIA

Product Version: Oncology Information System for Medical Oncology 11.0 MR5

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

Date of Usability Test: May 6-7, 2014

Date of Report: May 20, 2014

Report Prepared By:

The Usability People, LLC

4000 Legato Road, Suite 1100

Fairfax, VA 22033


TABLE OF CONTENTS

Executive Summary
Introduction
Method
    Participants
    Study Design
    Tasks
    Test Location
    Test Environment
    Test Forms and Tools
    Participant Instructions
    Procedure
    Usability Metrics
        Data Scoring
Results
    Data Analysis and Reporting
        Effectiveness and Efficiency
        Satisfaction
    Discussion of Findings
        Effectiveness
        Efficiency
        Satisfaction
        Summary of Major Findings
        Risk Analysis
        Areas for Improvement
Appendices
    Appendix 1: Recruiting Screener
    Appendix 2: Additional Participant Information
    Appendix 3: Informed Consent Form
    Appendix 4: Participant Tasks
    Appendix 5: System Usability Scale Questionnaire
    Appendix 6: Computer System Usability Questionnaire
    Appendix 7: Detailed task performance for each participant


Executive Summary

On May 6 and 7, 2014, The Usability People, LLC conducted a usability test of Varian Medical Systems' ARIA Oncology Information System for Medical Oncology 11.0 MR5 (ARIA). The test was conducted from the Winnipeg, Manitoba, Canada offices of Varian Medical Systems over remote teleconferencing sessions using Go To Meeting. The purpose was to test and validate the usability of the current user interface and provide evidence of the usability of ARIA, the EHR Under Test (EHRUT). Ten (10) healthcare providers matching the target demographic criteria participated in the usability test, using the EHRUT in simulated but representative tasks.

The study focused on measuring the effectiveness of, efficiency of, and satisfaction

with ARIA EHR among a sample of participants representing potential users of the system.

Performance data was collected on sixteen (16) tasks typically conducted on an EHR.

Tasks created were based upon the criteria specified within the test procedure structure

for evaluating conformance of Electronic Health Record (EHR) technology to the

certification criteria defined in 45 CFR Part 170 Subpart C of the Health Information

Technology: Standards, Implementation Specifications, and Certification Criteria for

Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent

Certification Program for Health Information Technology, Final Rule as published in the

Federal Register on September 4, 2012.

Results of the study indicated that the ARIA EHR system was satisfactory with regard to effectiveness and efficiency and that participants were satisfied with the usefulness of the system.


Introduction

The Electronic Health Record System Under Test (EHRUT) tested for this study,

ARIA Oncology Information System for Medical Oncology 11.0 MR5 (ARIA), was specifically

designed to present medical information to physicians, nurses and other healthcare

practitioners on desktop computers in standard outpatient medical care settings. This

study tested and validated the usability of the current user interface and provides evidence

of the usability of ARIA with representative exercises and in realistic user conditions. To

this end, measures of effectiveness and efficiency, such as time on task, number of errors

made, and completion rates were captured during usability testing. Satisfaction was

assessed and user comments collected using two industry-standard questionnaires.

Method

Participants

Ten individuals (5 men and 5 women) participated in the usability test of the EHRUT (ARIA). Participants were physicians, nurses, respiratory therapists, practice managers, and other healthcare practitioners recruited from a database of potential participants maintained by The Usability People, LLC. The contacts contained within this database were generated via potential participants' responses to postings on popular Internet and social media sites and a link at the bottom of The Usability People website. Several participants were also contacted directly using a list of contacts and email addresses provided by Varian Medical Systems. Those who responded to the invitation to take part in the study were directed to an online questionnaire that served as the participant screener. (The screening questionnaire is provided as Appendix 1.) Participants meeting the criteria for participation were contacted and scheduled via email.


Participants in the usability test of ARIA had a variety of healthcare backgrounds and demographic characteristics. Table 1 presents participant characteristics, including demographics, professional experience, computing experience, and number of previous EHRs used. As compensation for their participation, all individuals received a gift card.

(Additional participant background characteristics are presented in Appendix 2.)

Table 1. Participant Characteristics

Part ID | Gender | Age | Education | Role/Title | Work facility | Professional Experience (yrs) | # EHRs worked with | EHR Experience (yrs) | Hrs/Wk with EHRs
P1 | Female | 40-50 | Master's degree in Biomedical Informatics | Pharmacy technician | Medical Center University Hospital | 15 | 2 | 2 | 15
P2 | Male | 40-59 | Respiratory Care Practitioner | Owner | Laboratory for cardiac, pulmonary & sleep disorders testing | 15 | 3 | 7 | 25
P3 | Female | 60-74 | Medical Practice Mgt | Practice Manager | Medical Practice | 20 | 2 | 5 | 40
P4 | Male | 23-39 | BSHA, R.T.(R)(T) | Administrative Director, Oncology Services | University setting | 12 | 2 | 3 | 30
P5 | Female | 40-59 | Healthcare IT professional; former Oncology Scientist | Healthcare Consultant; evaluated many EHRs | — | 10 | 7 | 1 | 65
P6 | Male | 40-59 | Board certified medical physicist | Manage radiation diagnostic imaging depts | Free-standing cancer center | 4 | 2 | 6 | 40
P7 | Female | 23-39 | MSH | Senior Application Analyst | Healthcare facility | 11 | 4 | 11 | 40
P8 | Male | 40-59 | Health IT Instructor | Instructor | Community College | 4 | 7 | 11 | 35
P9 | Female | 40-59 | RN | Implemented HIS in clinical applications; on medical leave now | Health Information System company | 35 | 3 | 15 | 50
P10 | Male | 40-59 | Physician | Provider and CMIO | Private Practice | 9 | — | — | —

None of the participants were from the vendor organization (Varian Medical Systems) that

produced and supplied the evaluated system nor did any participant have any direct


connection to the testing organization (The Usability People, LLC). Most participants did

not have any direct experience or training using the ARIA EHR system.

Study Design

The overall objective of this usability test was to uncover areas where the ARIA EHR

application performed well – that is, effectively, efficiently, and with satisfaction – and

areas where the application failed to serve the needs of users. Data from this test may be

used as a baseline for future tests of updated versions of ARIA and/or for comparing ARIA

with other EHRs presenting the same tasks. In short, this testing serves as both a means to

record or benchmark current usability and to identify areas where improvements can be

made.

Participants completed the test of ARIA usability during individual 60-minute Go To

Meeting sessions. During the test each participant interacted with various components of

the ARIA EHR. Each participant was provided with the same instructions.

ARIA was evaluated for effectiveness, efficiency and satisfaction as defined by the

following measures collected and analyzed for each participant:

• Number of tasks successfully completed without assistance

• Time to complete the tasks

• Number and description of errors

• Path deviations

• Participant’s verbalizations (comments)

• Participant’s satisfaction ratings of the system
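As a concrete illustration of how these per-participant measures could be organized for analysis (not part of the study materials; the field names below are assumptions chosen for clarity), a simple record structure might look like this in Python:

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TaskObservation:
    """One participant's performance on one task; field names are illustrative."""
    task_id: int
    success: bool                        # completed without assistance
    time_seconds: Optional[float]        # None for failed or assisted attempts
    errors: List[str] = field(default_factory=list)    # error descriptions
    path_deviations: int = 0             # count of off-path steps
    comments: List[str] = field(default_factory=list)  # participant verbalizations
    satisfaction: Optional[int] = None   # post-task rating, 1 (Not Satisfied) to 5

@dataclass
class ParticipantSession:
    """All observations for one de-identified participant (e.g., "P1")."""
    participant_id: str
    tasks: List[TaskObservation] = field(default_factory=list)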


Tasks

A total of 16 tasks were constructed to be realistic and representative of the activities a user might perform with ARIA in actual medical settings. The 16 tasks were

created based upon the criteria specified within the test procedure structure for evaluating

conformance of Electronic Health Record (EHR) technology to the certification criteria as

defined in 45 CFR Part 170 Subpart C of the Health Information Technology: Standards,

Implementation Specifications, and Certification Criteria for Electronic Health Record

Technology.

The tasks focused on the following issues:

1. Computerized Provider Order Entry System (CPOE) (§ 170.314(a)(1));

2. Drug-drug, drug-allergy interaction checks (§ 170.314(a)(2));

3. Medication list (§ 170.314(a)(6));

4. Medication allergy list (§ 170.314(a)(7));

5. Clinical decision support (§ 170.314(a)(8));

6. Electronic medication administration record (§ 170.314(a)(16));

7. Electronic prescribing (§ 170.314(b)(3));

8. Clinical information reconciliation (§ 170.314(b)(4)).

A copy of the tasks presented to participants in the usability test of ARIA EHR can be found

in Appendix 4.

Test Location

All participants were tested on the ARIA system during remote conferencing sessions

using Go To Meeting. Each participant was requested in advance to secure a quiet room with

minimal distractions and a computer that could connect to the Internet with a Go To Meeting


session. During a given Go To Meeting session, only the test administrator and that participant

communicated with one another.

The Go To Meeting usability test session was conducted by a test administrator from

the testing organization (The Usability People, LLC) working from a small conference room

at Varian Medical Systems’ Winnipeg, Manitoba, Canada location. Seated near the

administrator, a data logger from the testing organization took detailed notes on each

session, including user comments and satisfaction ratings following each task. A

representative from Varian Medical systems also sat in to observe the session and provided

technical assistance running ARIA. During a session the test administrator, the data logger,

and the Varian representative could see only the participant’s screen and hear the

participant’s comments, questions, and responses.

Test Environment

While the EHRUT typically would be used in a healthcare office or ambulatory

surgery center facility, testing of the ARIA EHR system was conducted via remote

connection during individual Go To Meeting sessions. Each participant called into a Go To

Meeting session and was connected by the test administrator to the application.

The ARIA application itself ran on a Windows platform on a LAN connection using a

sample database set up specifically for the test. Participants used a mouse and keyboard

when interacting with the EHRUT and were given remote control of the administrator’s

workstation to perform the tasks.


Test Forms and Tools

As part of the usability test, several documents and instruments were used. Examples

of the documents used during the usability test, including an informed consent form, the

tasks, and post-test questionnaires, can be found in Appendices 3-6, respectively.

Participants’ interaction with the ARIA was captured and recorded digitally using Morae

screen capture software running on the test administrator’s workstation. Verbal responses

were recorded through either the microphone integrated into the participant’s computer

or through a telephone connection. This information was electronically transmitted to the

administrator and to the data logger during each test session.

Participant Instructions

The administrator read the following instructions aloud to each participant:

Thank you for participating in this study. Your input is very important. Our session

today will last about 60 minutes. During that time you will use an instance of an electronic

health record. I will ask you to complete a few tasks using this system and answer some

questions.

Please note that we are not testing you; we are testing the system. Therefore if you

have any difficulty this may mean that something needs to be improved in the system. I will be

here in case you need specific help, but I am not able to instruct you or provide help in how to

use the application.

Overall, we are interested in how easy (or how difficult) this system is to use, what in it

would be useful to you, and how we could improve it. I did not have any involvement in its

creation, so please be honest with your opinions. All of the information that you provide will

be kept confidential and your name will not be associated with your comments at any time.

Should you feel it necessary you are able to withdraw at any time during the testing.

Participants were then given sixteen (16) tasks to complete.


Procedure

Upon connection to the online meeting tool (Go To Meeting), each participant was greeted, and his or her identity was verified and matched to a name on the participant schedule. Participant names were replaced with participant IDs so that a given individual's data could not be linked to his/her identity. Prior to beginning testing, each participant reviewed

and signed an informed consent form (See Appendix 3) and emailed it to the organization

(The Usability People, LLC) conducting the test.

Two staff members of the Usability People, a usability test administrator and a data

logger, administered the test. The administrator moderated the session by providing both

verbal and written instructions for the overall usability test and for each of the tasks

comprising the test. The administrator also monitored task success, path deviations,

number and description of errors, and audio-recorded participant verbal comments. The data logger recorded task times, obtained post-task rating data, and took notes on participant comments and administrator feedback.

For each of the sixteen (16) tasks, participants were presented with written instructions on their computers. Following the administrator's instructions, each participant performed

each task by first reading the task out loud then stating in his or her own words his or her

interpretation of the task requirements. When the participant’s interpretation matched the

actual goal of the task, the administrator instructed the participant to begin and task timing

began. Task time was stopped and recorded when the test administrator observed on his

workstation that the participant had successfully completed the task. If a participant failed to complete a task within the expected amount of time for that task, the task was marked as "Timed Out." After each task, the test administrator asked the participant, "On a scale


from 1 to 5, where 1 is ‘Not Satisfied’ and 5 is ‘Very Satisfied,’ how satisfied were you with

this task?” This same procedure was conducted for each of the sixteen (16) tasks.

Following completion of the 16 EHR tasks, the administrator electronically presented to the participant two post-test questionnaires: the System Usability Scale (SUS; see Appendix 5) and the Computer System Usability Questionnaire (CSUQ; see Appendix 6). After the participant completed both questionnaires, the administrator thanked each participant for his or her time and allowed the participant to make any comments on or ask any questions

about the system and tasks presented. For each session, the participant’s schedule,

demographic information, task success rate, time on task, errors, deviations, verbal

responses, and post-test questionnaire were recorded. The system was then reset to

proper test conditions for the next participant.

Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (NIST IR 7741, November 2010), EHRs should support a process

that provides a high level of usability for all users. The goal is for users to interact with the

system effectively, efficiently, and with an acceptable level of satisfaction. To this end,

metrics for effectiveness, efficiency and user satisfaction were captured during the

usability testing. The goals of the test were to assess:

• Effectiveness of ARIA by measuring participant success rates and errors

• Efficiency of ARIA by measuring the average task time and path deviations.

• Satisfaction with ARIA by measuring ease of use ratings.


Data Scoring

Table 2 details how tasks were scored, errors evaluated, and the time data analyzed:

Table 2. Scoring Protocols for Effectiveness, Efficiency, and Satisfaction

Effectiveness: Task Success
A task was counted as "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. Results are provided as a percentage.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as "Fail." No task times were taken for failed attempts. The total number of errors was calculated for each task and divided by the total number of times that task was attempted. Results are presented as the average error rate. Note: not all deviations are counted as errors.

Effectiveness: Prompted Successes
Because some tasks are dependent upon the successful completion of previous tasks, participants may receive a limited number of "prompts" to help prepare the system data for the prerequisites for subsequent tasks. When a participant was able to complete the data entry on a task with 3 or fewer prompts, the task was counted as an "Assisted" completion. No task times were recorded for Assisted completions.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if, for example, the participant navigated to an incorrect screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control.

Efficiency: Task Time
Each task was timed from the administrator's prompt "Begin" until the participant said "Done." If the participant failed to say "Done," timing stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task.

Satisfaction: Ease of Use Ratings and System Satisfaction
Participants' subjective impression of the ease of use of the application was measured by administering both a single post-task question and two post-session questionnaires. After each task, the participant rated, on a scale of 1 ("Not Satisfied") to 5 ("Very Satisfied"), his or her subjective satisfaction with performance on the task. These data were averaged across participants. To measure participants' confidence in and likeability of the ARIA EHR overall, the testing team administered electronic versions of the System Usability Scale (SUS) and the Computer System Usability Questionnaire (CSUQ). See the SUS questionnaire in Appendix 5 and the CSUQ in Appendix 6.
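The scoring rules in Table 2 reduce to simple per-task aggregations: completion rate over all attempts, mean time over successful completions only, and means of path deviations and post-task satisfaction. The sketch below, which reuses the illustrative ParticipantSession records introduced in the Study Design section, shows one way these aggregations could be computed; it is an assumption-based illustration, not the testing organization's actual analysis code.

from statistics import mean

def task_summary(sessions, task_id):
    """Aggregate one task across participants following the Table 2 scoring rules."""
    observations = [t for s in sessions for t in s.tasks if t.task_id == task_id]
    if not observations:
        return None
    successes = [o for o in observations if o.success]
    # Task times are averaged over successfully completed attempts only.
    times = [o.time_seconds for o in successes if o.time_seconds is not None]
    ratings = [o.satisfaction for o in observations if o.satisfaction is not None]
    return {
        "completion_rate_pct": 100.0 * len(successes) / len(observations),
        "mean_task_time_s": mean(times) if times else None,
        "mean_path_deviations": mean(o.path_deviations for o in observations),
        "mean_satisfaction": mean(ratings) if ratings else None,
    }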


Results

Data Analysis and Reporting

The results of the usability test of the ARIA EHR were analyzed according to the methods described in the Usability Metrics section above and are detailed below. Note that results should be evaluated relative to the objectives and goals outlined in the Study Design section above. The data should yield actionable findings that, if addressed, will have a material, positive impact on user performance.

Effectiveness and Efficiency

Table 3 presents a summary of task performance, showing average time on task, expected task times, task completion rates, path deviations, and task satisfaction:

Table 3. Usability Test Results

Task | Mean Task Time (min:sec) | Expected Task Time (min:sec) | Completion Rate (%) | Mean # Path Deviations | Mean Task Satisfaction
TASK 1: Configuration of CDS interventions | 1:04 | 2:30 | 44% | 2.88 | 2.9
TASK 2: Clinical information reconciliation | 4:09 | 5:00 | 50% | 2.33 | 2.3
TASK 3: Demographics CDS Intervention | 0:51 | 2:00 | 40% | 1.89 | 1.9
TASK 4: Vital Signs CDS Intervention | 1:25 | 2:00 | 90% | 4.00 | 4.0
TASK 5: Medication allergy list (access, record); Medication allergy list CDS Intervention | 2:07 | 3:00 | 80% | 2.78 | 2.8
TASK 6: Medication allergy list (access, change) | 1:14 | 1:30 | 70% | 3.00 | 2.8
TASK 7: Medication list (access, change) | 1:10 | 3:30 | 89% | 2.89 | 2.9
TASK 8: Electronic prescribing, View User Diagnostic and Therapeutic Reference Information | 3:35 | 3:00 | 33% | 2.56 | 2.6
TASK 9: Radiology orders (access, record) | 1:19 | 2:00 | 100% | 4.50 | 4.5
TASK 10: Radiology orders (access, change) | 2:10 | 2:00 | 100% | 3.88 | 3.9
TASK 11: Laboratory orders (access, record) | 1:12 | 2:00 | 100% | 4.25 | 4.3
TASK 12: Laboratory orders (access, change) | 1:06 | 2:00 | 100% | 4.71 | 4.7
TASK 13: Lab result CDS Intervention | 1:28 | 3:00 | 43% | 2.29 | 2.3
TASK 14: Adjustment of severity level of drug-drug interventions | 1:44 | 2:00 | 50% | 2.83 | 2.8
TASK 15: Medication order (record), View drug-drug and drug-allergy interventions, Medication List Interventions | 4:38 | 4:00 | 71% | 3.29 | 3.3
TASK 16: Medication order (access, change) | 1:04 | 2:00 | 100% | 4.14 | 4.2


As Table 3 shows, relative to expected performance standards as defined by Varian Medical

Systems, participants in the ARIA usability test performed very well.

Satisfaction

Individual Task Satisfaction

Participants verbally indicated their satisfaction with each task using a scale of "1" ("Not Satisfied") to "5" ("Very Satisfied"). As Figure 1 shows, individual task satisfaction ranged from a low of 1.9 out of 5 on Task 3 (Clinical information reconciliation) to a high of 4.7 on Task 12 (Laboratory orders).

Figure 1. Satisfaction Ratings of Individual Tasks
[Bar chart of mean post-task satisfaction (0-5 scale on the y-axis) by task number (1-16 on the x-axis); the plotted values match the Mean Task Satisfaction column of Table 3.]

System Usability Scale

The System Usability Scale (SUS) is a simple, 10-item Likert-type attitude scale providing a global subjective assessment of usability from the user's perspective (John Brooke developed the SUS at Digital Equipment Corporation in 1986). The SUS is scored from 0 to 100; scores under 60 represent systems with less than optimal usability, and scores over 80 are considered better than average. See Appendix 5 for a copy of the SUS.
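The report does not reproduce the SUS scoring arithmetic; for reference, the standard procedure converts the ten 1-5 item responses into a 0-100 score by adjusting each item and multiplying the sum by 2.5. A minimal Python sketch:

def sus_score(responses):
    """Standard SUS scoring: ten item responses on a 1-5 scale -> 0-100 score.
    Odd-numbered items are positively worded; even-numbered items are negatively worded."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses in the range 1-5")
    adjusted = [(r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based, so i % 2 == 0 is item 1, 3, ...
                for i, r in enumerate(responses)]
    return sum(adjusted) * 2.5

# A neutral response set (all 3s) scores 50; stronger agreement with the
# positively worded items pushes the score toward 100.
print(sus_score([3] * 10))   # -> 50.0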

The mean total SUS score for the ARIA EHR was 48, with individual scores ranging from a low of 13 to a high of 98. Overall, participant-users rated their satisfaction on the SUS with the ARIA system to be less than optimal.

Computer System Usability Questionnaire

Using the Computer System Usability Questionnaire (CSUQ; Lewis, J. R. (1995). IBM Computer Usability Satisfaction Questionnaires: Psychometric Evaluation and Instructions for Use. International Journal of Human-Computer Interaction, 7:1, 57-78), participants rated each of 19 items on a scale from 1 to 7, with a rating of 7 indicating the strongest agreement with the positively worded item. Responses for each item were summed and averaged into four scales: Interface Quality, Information Quality, System Usefulness, and an overall scale. See Appendix 6 for a copy of the CSUQ.
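The report does not spell out which of the 19 items feed each scale. The sketch below assumes the item-to-subscale mapping published by Lewis (1995) (System Usefulness: items 1-8; Information Quality: items 9-15; Interface Quality: items 16-18; Overall: items 1-19) and simply averages the answered (non-NA) items for each scale:

from statistics import mean

# Item-to-subscale mapping assumed from Lewis (1995); items are numbered 1-19.
CSUQ_SUBSCALES = {
    "System Usefulness":   range(1, 9),    # items 1-8
    "Information Quality": range(9, 16),   # items 9-15
    "Interface Quality":   range(16, 19),  # items 16-18
    "Overall":             range(1, 20),   # items 1-19
}

def csuq_scores(responses):
    """responses: dict mapping item number -> rating (1-7); items marked NA are omitted.
    Returns the mean rating per scale (higher = stronger agreement with the positive wording)."""
    scores = {}
    for scale, items in CSUQ_SUBSCALES.items():
        answered = [responses[i] for i in items if i in responses]
        scores[scale] = round(mean(answered), 2) if answered else None
    return scores

# Example with hypothetical ratings for a handful of items:
print(csuq_scores({1: 5, 2: 4, 9: 3, 16: 6, 17: 5}))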

Figure 2 displays CSUQ ratings for each of the four scales. In general, participants in the ARIA EHR study rated system usability to be about what would be expected, with most scores for each of the three subscales and the overall scale at or above the midpoint on each 7-point scale. Interface Quality received the highest average score for the 10 participants at 4.53; on Information Quality the average score was 3.53; on System Usefulness the average score was 3.58; and the overall average CSUQ score was 3.72.


Figure 2. Computer System Usability Questionnaire (CSUQ)
[Bar chart of mean user rating (out of 7) for the Overall Score, System Usefulness, Information Quality, and Interface Quality scales; the values are given in the preceding paragraph.]


Discussion of Findings

In general, the participants felt satisfied with the ARIA EHR. A few of the participants struggled with some tasks and were unable to successfully complete a number of them. Several of the participants were not able to complete some tasks given the time constraints of a summative test. Participants, however, were mostly able to perform tasks successfully with some minor assistance. The participant performance rate and the participant satisfaction rates would suggest that the ARIA EHR system is a usable system with a small number of usability issues that are likely to be easily overcome with some minor "fixes."

Effectiveness

Of the sixteen (16) tasks presented, five (5) were successfully completed by all participants. The lowest task completion rate for any task was thirty-three (33) percent, and every task was completed successfully by at least one participant. Over all participants, the mean successful task completion rate was a high seventy-three (73) percent, indicating that few participants had difficulty with the tasks.

Prior experience with EHR systems was positively related to successful task

performance; participants with prior EHR experience were more likely to successfully

complete tasks than those without prior experience. Those participants who had previous

experience and/or training using the ARIA EHR were the most effective.

Efficiency

Participants who successfully completed tasks generally completed those tasks

within an acceptable time. Some tasks were completed more quickly than the calculated

expected time; many were almost equal to this time, while several tasks took much longer

than expected. The tasks that took the longest required the participants to navigate to a


particular page, interact with a complex workflow, and locate and select specific actions.

Those tasks could be performed more quickly with an update to the information

architecture, an increase in the amount of embedded assistance, a more consistent support

of a right-click interaction and perhaps an enhanced visual indication of primary or

secondary actions.

A few participants made a number of errors when attempting to navigate toward

solving their assigned tasks. Many of these errors may be associated with those

participants not being familiar with the EHR system, and not understanding the presented

information architecture of the ARIA system. As noted above, prior experience with EHR

systems was related to successful task completion. Similarly, experience and practice with

the given system may have positive effects with regard to user efficiency. Those

participants who had previous experience and/or training using the ARIA EHR were the

most efficient.

Satisfaction

Participants were generally satisfied with the ARIA EHR system; ratings on the SUS (mean = 48) and the CSUQ (overall score = 3.72) demonstrated satisfaction with the system.

On the CSUQ, participants ranked the Interface Quality scale highest of the three subscales, suggesting that the system was well liked visually. Individual task satisfaction

ratings were related to individual user performance. Those participants who were able to

successfully complete tasks were also more likely to rank those tasks as satisfying, while

those participants who did poorly or were not able to complete a task ranked those tasks as


unsatisfying. Overall however, participant satisfaction with ARIA was about what was

expected given the performance data.


Summary of Major Findings

This evaluation demonstrated that the ARIA EHR system is a usable system with a relatively short learning curve. Although participants who had never used the ARIA EHR system before the study experienced minor difficulty understanding the navigation and information architecture, the results suggest that with minor changes their performance and satisfaction would likely improve. Those participants who had previous experience and/or training using the ARIA EHR were the most efficient, effective, and satisfied.

Risk Analysis

The following table presents a list of tasks prioritized by the risk of error observed during testing.

Table 5. Risk Analysis

Task | Percent Complete | Risk Status
TASK 1: Configuration of CDS interventions | 44% | Low
TASK 2: Demographics CDS Intervention | 50% | Low
TASK 3: Clinical information reconciliation | 40% | Low
TASK 4: Vital Signs CDS Intervention | 90% | None
TASK 5: Medication allergy list, medication allergy lists CDS Intervention | 80% | Very Low
TASK 6: Medication allergy list | 70% | Very Low
TASK 7: Medication list | 89% | None
TASK 8: Electronic prescribing, view user diagnostic and therapeutic reference information | 33% | Moderate
TASK 9: Radiology orders | 100% | None
TASK 10: Radiology orders | 100% | None
TASK 11: Laboratory orders | 100% | None
TASK 12: Laboratory orders | 100% | None
TASK 13: Lab result CDS Intervention | 43% | Low
TASK 14: Problem List CDS Intervention | 50% | Low
TASK 15: Adjustment of severity level of drug-drug interventions | 71% | Very Low
TASK 16: Medication order, View drug-drug and drug-allergy interventions, Medication List Interventions | 100% | None


Areas for Improvement

The following is a list of potential areas for improvement. Making these and other

minor enhancements will improve the overall user experience of the ARIA system and

increase the effectiveness, efficiency, and satisfaction for both experienced and novice ARIA

users.

• Button location and nomenclature consistency.

o The location of, and wording on, a number of buttons that performed the same or similar tasks should be more consistent.

• Indication of current location and/or functionality.

o Some participants tried to navigate to their current location, possibly because they were not given a clear indication of where they were in the system.

o Many participants tried to make edits when they were in a 'read only' view of

the information. Using different controls or providing more textual

information would help to reduce a number of the associated errors made by

participants.

• Icons and tool tips.

o A number of icons were presented without an associated 'tool-tip'. Participants who were not familiar with the system were not able to determine the functionality without a tool-tip.

• Error and Warning messages.

o A number of participants had difficulty perceiving error or warning messages. An error message/warning message strategy that provides a consistent location and consistent tone for these important messages will help improve user satisfaction rates.


Appendices

Appendix 1: Recruiting Screener

1. Are you male or female?

2. Have you participated in a focus group or usability test in the past 6 months?

3. Do you, or does anyone in your home, work in marketing research, usability research, and/or

web design?

4. Do you, or does anyone in your home, have a commercial or research interest in an electronic

health record software or consulting company?

5. Which of the following best describes your age?

_____ 23 to 39; _____ 40 to 59; _____ 60 to 74; _____ 75 or older.

6. Do you require any assistive technologies to use a computer?

7. Please list your medical or nursing credentials

8. How long have you held this position? (no. of months):

9. What type of facility do you work in and what is your role there?

10. How are medical records handled at your (main) workplace?

_____All Paper _____Some Paper/Some Electronic ___All Electronic

11. How many EHRs do you use or have you worked with?

12. How many years have you used an electronic health record?

13. About how many hours per week do you spend using a computer?

14. What computer platform(s) do you usually use?

15. In the last month, about how often have you used an electronic health record?

_____Did not use last month ___Every day _____A few times a week.


Appendix 2: Additional Participant Information

Participant Number | Do you, or does anyone in your home, work in marketing research, usability research, web design or Electronic Health Record software? | Have you participated in a focus group or usability test in the past 3 months? | Do you, or does anyone in your home, have a commercial or research interest in an electronic health record software or consulting company? | About how many hours per week do you spend using a computer? | What computer platform(s) do you usually use? | How are medical records handled at your facility? | Last month, how often used EHR?
P1 | No | No | No | 40 | PC | All Electronic | Every day
P2 | No | No | No | 25 | PC; Apple/Mac | All Electronic | Every day
P3 | No | No | No | 40 | PC | Some Paper/Some Electronic | A few times per week
P4 | No | No | No | 30 | PC; Apple/Mac | All Electronic | Every day
P5 | No | No | No | 65 | PC | Some Paper/Some Electronic | I do not use an EHR system
P6 | No | No | No | 40 | PC; Apple/Mac | All Electronic | Every day
P7 | No | No | Yes | 40 | PC | All Electronic | Every day
P8 | No | No | No | 35 | PC; Apple/Mac | Some Paper/Some Electronic | A few times per week
P9 | No | No | No | 50 | PC | Some Paper/Some Electronic | A few times per month
P10 | No | No | No | — | — | — | —


Appendix 3: Informed Consent Form

Varian Medical Systems would like to thank you for participating in this study. The purpose

of this study is to evaluate an electronic health records system. If you decide to participate,

you will be asked to perform several tasks using the prototype and give your feedback. The

study will last about 60 minutes.

Agreement

I understand and agree that, as a voluntary participant in the present study conducted by Varian Medical Systems, I am free to withdraw consent or discontinue participation at any time. I understand and agree to participate in the study conducted and recorded by Varian Medical Systems.

I understand and consent to the use and release of the video recording by Varian Medical Systems. I understand that the information and videotape are for research purposes only and that my name and image will not be used for any purpose other than research. I relinquish any rights to the video and understand that the video recording may be copied and used by Varian Medical Systems without further permission.

I understand and agree that the purpose of this study is to make software applications

more useful and usable in the future.

I understand and agree that the data collected from this study may be shared outside of Varian Medical Systems. I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results.

I agree to immediately raise any concerns or areas of discomfort with the study

administrator. I understand that I can leave at any time.

Please check one of the following:

____YES, I have read the above statement and agree to be a participant.

____NO, I choose not to participate in this study.

Signature: _____________________________________ Date _____________________


Appendix 4: Participant Tasks (ARIA Oncology Information System for Medical Oncology 11.0 MR5)

TASK 1: Configuration of CDS interventions

You have a few free minutes before your next patient visit and you want to view and configure

(activate, deactivate) some of the Clinical Decision Support (CDS) rules within the ARIA Planner

application.

Check the status of each of the following rules and make sure that all of the following rules

are “active”:

• Anemia Support

• Antiemesis

• Antiemetics

• Bone Mets

• CDSSupport

• Growth Factors

• Fall Risk

TASK 2: Clinical information reconciliation, Problem List CDS intervention

Your patient, Josephine Baker, has brought summary information from another provider in a C-CDA file on an external drive (USB thumb drive). You have copied the information from the external drive and uploaded this document to the system. The following information was recognized:

• Problems

o Allergic Asthma

o B-cell lymphoma

o Malignant neoplasm of upper-inner quadrant of female breast

• Medications

o Ibuprofen - oral capsule take as directed

o Lipitor - oral tablet take as directed

o Lorazepam - oral tablet take as directed

o Zofran - solution oral take as directed

• Medication Allergies

o Compazine - nausea/vomiting/diarrhea

o Nuts - shock/unconsciousness

o Penicillin V Potassium - chest pains / irregular heart beat

o Shellfish - skin rash/hives

Use the ARIA Manager application to view and compare the uploaded medical records

with the existing records to create a single reconciled list of medications, medication

allergies, and problems.

TASK 3: Demographics CDS Intervention.


Locate and acknowledge the reminder to assess the patient fall risk.

TASK 4: Vital Signs CDS Intervention.

Your assistant has measured the patient's height, weight, and blood pressure, and provided you with the following values:

• Height: 65 inches

• Weight: 68 kg

• Blood pressure: 150/110

Enter this information into the EHR.

TASK 5: Medication allergy list (access, record). Medication allergy list CDS Intervention.

During the visit, your patient has told you that after taking amoxicillin they experienced some nausea and vomiting.

You decide to add an allergy to amoxicillin to their record.

Enter this information into the EHR.

TASK 6: Medication allergy list (access, change)

Your patient has told you that they no longer have an allergy to codeine.

Edit this information in the EHR.

TASK 7: Medication list (access, change).

Because your patient has told you of the allergy to amoxicillin, you need to stop the patient from taking amoxicillin.

Remove amoxicillin from their current medication list.

TASK 8: Electronic prescribing, View User Diagnostic and Therapeutic Reference Information.

In order to treat your patient, you need to replace the amoxicillin with something the patient is

not allergic to.

Prescribe Doxycycline Hyclate 100mg to take one tablet twice a day after meals for 5 days.

Your patient asked you to send this prescription to the pharmacy electronically.

TASK 9: Radiology orders (access, record).

During the examination of your patient, you decide to create a physician order for radiology:


• CT scan imaging, pelvis; without contrast material(s).

Enter this information into the EHR.

TASK 10: Radiology orders (access, change).

In order to rule out a diagnosis of pyelonephritis, you decide to change the radiology order that you just entered and instead refer the patient for:

• 76775: Ultrasound, diagnostic and retroperitoneal.

Edit the radiology order and enter the necessary information.

TASK 11: Laboratory orders (access, record).

You want to enter a physician order for lab work as follows:

• Creatinine 24 hr CrCl in Urine lab test.

Enter the lab order into the EHR.

TASK 12: Laboratory orders (access, change).

You decide to change this lab order and add an additional test:

• Protein (24 hour) in Urine.

Edit the previously added order to include the additional test.

TASK 13: Lab result CDS Intervention.

During your current patient visit, you have received the results from a test that was ordered on May 7:

North Lab Req -- Lab order:

• Erythrocyte sedimentation rate.

Lab result:

• Specimen: Blood
• Result: 20
• Unit: mm/h

Enter this information into the EHR.

TASK 14: Adjustment of severity level of drug-drug interventions.


Using the ARIA Security application, adjust the minimum severity level of drug-drug interventions in the Medi-Span portion of the system to "minor."
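The report does not describe how the Medi-Span severity setting is applied internally. As a purely hypothetical illustration of what a "minimum severity" threshold does, the sketch below suppresses interactions that fall below the configured level; the severity names and their ordering are assumptions.

    # Hypothetical minimum-severity filter for drug-drug interaction alerts.
    SEVERITY_ORDER = ["minor", "moderate", "major", "contraindicated"]

    def alerts_to_show(interactions, minimum="moderate"):
        """interactions: list of (drug_a, drug_b, severity); keep those at or above the minimum."""
        floor = SEVERITY_ORDER.index(minimum)
        return [i for i in interactions if SEVERITY_ORDER.index(i[2]) >= floor]

    found = [("Drug A", "Drug B", "moderate"), ("Drug C", "Drug D", "minor")]
    print(alerts_to_show(found, minimum="minor"))   # lowering the floor to "minor" shows both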

TASK 15: Medication order (record), View drug-drug and drug-allergy interventions,

Medication List Interventions.

Prescribe Naproxen 250 mg, to take one tablet once a day for 5 days.

During prescribing, you will be notified of a drug-allergy interaction alert and drug-drug interaction alerts; because of this, you decide to also prescribe Oxycodone HCL 15 mg, to take one tablet twice a day for 5 days.

TASK 16: Medication order (access, change).

You find out that while ordering the Oxycodone you made a mistake in the dose and need to change it.

Change: "Take 1 tablet twice a day"
To: "Take 1 tablet once a day"

Edit the medication order and print this prescription for your patient.


Appendix 5: System Usability Scale Questionnaire


Appendix 6: Computer System Usability Questionnaire

Please provide your impression of the usability of the system by answering each of the

questions below:

Each statement below is rated on a 7-point scale, from 1 (Strongly Disagree) to 7 (Strongly Agree), with an NA option:

1. Overall, I am satisfied with how easy it is to use this system
2. It was simple to use this system
3. I can effectively complete my work using this system
4. I am able to complete my work quickly using this system
5. I am able to efficiently complete my work using this system
6. I feel comfortable using this system
7. It was easy to learn to use this system
8. I believe I became productive quickly using this system
9. The system gives error messages that clearly tell me how to fix problems
10. Whenever I make a mistake using the system, I recover easily and quickly
11. The information (such as online help, on-screen messages, and other documentation) provided with this system is clear
12. It is easy to find the information I needed
13. The information provided for the system is easy to understand
14. The information is effective in helping me complete the tasks and scenarios
15. The organization of information on the system screens is clear
16. The interface of this system is pleasant
17. I like using the interface of this system
18. This system has all the functions and capabilities I expect it to have
19. Overall, I am satisfied with this system
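For reference, the sketch below shows one way responses to this questionnaire could be aggregated. The subscale groupings follow Lewis's published CSUQ (items 1-8 system usefulness, 9-15 information quality, 16-18 interface quality), and dropping "NA" responses is an assumption for illustration; the report does not describe its scoring procedure.

    # Illustrative CSUQ aggregation; "NA" handling is an assumption.
    from statistics import mean

    SUBSCALES = {
        "System Usefulness":   range(1, 9),
        "Information Quality": range(9, 16),
        "Interface Quality":   range(16, 19),
        "Overall":             range(1, 20),
    }

    def score_csuq(responses):
        """responses: dict {item number: rating 1-7 or "NA"}."""
        scores = {}
        for name, items in SUBSCALES.items():
            values = [responses[i] for i in items if i in responses and responses[i] != "NA"]
            scores[name] = round(mean(values), 2) if values else None
        return scores

    # Example with made-up answers from one participant.
    example = {i: 5 for i in range(1, 20)}
    example[9] = "NA"
    print(score_csuq(example))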


Appendix 7: Detailed task performance for each participant.

(A dash indicates that no value was recorded for that participant.)

Task 1 (Configuration of CDS interventions)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            5:40        Time Out   6                   -
p2            6:13        Time Out   4                   2
p3            0:33        Success    0                   3
p4            2:12        Assisted   6                   5
p5            1:59        Assisted   4                   1
p6            0:22        Success    0                   5
p7            -           -          -                   -
p8            2:29        Time Out   4                   1
p9            1:58        Success    3                   1
p10           1:26        Success    1                   5

Expected Time: 5:00
Average Time on Task: 1:04
Average Task Satisfaction: 2.88
Average # Path Deviations: 3.11
Percent Success: 44%

Task 2 (Clinical information reconciliation, Problem List CDS intervention)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            8:32        Time Out   7                   -
p2            7:35        Time Out   8                   3
p3            5:11        Success    1                   1
p4            2:57        Success    2                   5
p5            5:02        Assisted   4                   3
p6            3:59        Success    2                   2
p7            6:39        Time Out   7                   3
p8            2:56        Fail       6                   1
p9            3:35        Success    0                   1
p10           5:03        Success    2                   2

Expected Time: 5:00
Average Time on Task: 4:09
Average Task Satisfaction: 2.33
Average # Path Deviations: 3.90
Percent Success: 50%


Task 3 (Demographic CDS intervention)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            5:27        Time Out   6                   -
p2            2:45        Time Out   4                   2
p3            3:42        Assisted   1                   1
p4            2:04        Time Out   5                   2
p5            1:46        Time Out   3                   4
p6            2:05        Time Out   4                   1
p7            0:31        Success    1                   3
p8            1:36        Success    2                   1
p9            0:34        Success    2                   1
p10           0:46        Success    1                   2

Expected Time: 2:00
Average Time on Task: 0:51
Average Task Satisfaction: 1.89
Average # Path Deviations: 2.9
Percent Success: 40%

Task 4 (Vital signs CDS intervention)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            1:43        Success    0                   -
p2            4:35        Assisted   2                   5
p3            1:20        Success    0                   5
p4            0:50        Success    0                   5
p5            2:15        Success    3                   3
p6            1:08        Success    0                   4
p7            0:52        Success    0                   5
p8            1:57        Success    3                   3
p9            1:10        Success    0                   3
p10           1:32        Success    0                   3

Expected Time: 2:00
Average Time on Task: 1:25
Average Task Satisfaction: 4.00
Average # Path Deviations: 0.80
Percent Success: 90%


Task 5 (Medication allergy list, Medication list CDS interventions)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            6:35        Time Out   6                   -
p2            9:57        Time Out   9                   4
p3            1:18        Success    0                   4
p4            2:00        Success    3                   4
p5            3:03        Success    4                   2
p6            2:35        Success    1                   3
p7            0:58        Success    0                   5
p8            1:52        Success    0                   1
p9            2:09        Success    1                   1
p10           3:07        Success    4                   1

Expected Time: 3:00
Average Time on Task: 2:07
Average Task Satisfaction: 2.78
Average # Path Deviations: 2.80
Percent Success: 80%

Task 6 (Medication allergy list - change)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            4:33        Time Out   4                   -
p2            5:27        Time Out   4                   3
p3            0:50        Success    1                   3
p4            0:28        Success    0                   3
p5            2:03        Success    3                   3
p6            1:25        Success    2                   3
p7            1:05        Success    1                   3
p8            2:12        Success    2                   3
p9            2:02        Fail       5                   3
p10           0:39        Success    0                   3

Expected Time: 1:30
Average Time on Task: 1:14
Average Task Satisfaction: 3.00
Average # Path Deviations: 2.44
Percent Success: 70%


Task 7 (Medication list)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            1:36        Success    0                   2
p3            6:25        Time Out   4                   2
p4            0:43        Success    0                   4
p5            2:04        Success    1                   3
p6            0:50        Success    0                   2
p7            0:37        Success    0                   5
p8            1:40        Success    2                   3
p9            1:19        Success    0                   3
p10           0:37        Success    0                   2

Expected Time: 3:30
Average Time on Task: 1:10
Average Task Satisfaction: 2.89
Average # Path Deviations: 0.78
Percent Success: 89%

Task 8 (Electronic prescribing)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            3:35        Time Out   2                   1
p3            4:36        Time Out   2                   3
p4            3:01        Success    0                   4
p5            4:20        Fail       6                   5
p6            4:29        Success    2                   3
p7            3:16        Success    0                   4
p8            4:26        Time Out   2                   1
p9            4:04        Time Out   2                   1
p10           4:07        Fail       6                   1

Expected Time: 2:00
Average Time on Task: 3:35
Average Task Satisfaction: 2.6
Average # Path Deviations: 2.44
Percent Success: 33%


Task 9 (Radiology orders)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            -           -          -                   -
p3            2:54        Success    0                   5
p4            0:29        Success    0                   5
p5            1:22        Success    0                   5
p6            0:49        Success    0                   4
p7            1:08        Success    0                   5
p8            0:31        Success    0                   4
p9            1:23        Success    0                   4
p10           1:59        Success    0                   4

Expected Time: 2:00
Average Time on Task: 1:19
Average Task Satisfaction: 4.5
Average # Path Deviations: 0.00
Percent Success: 100%

Task 10 (Radiology orders - change)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            -           -          -                   -
p3            3:57        Success    1                   3
p4            1:16        Success    0                   4
p5            1:15        Success    0                   5
p6            3:31        Success    1                   3
p7            2:16        Success    1                   4
p8            0:49        Success    0                   4
p9            3:20        Success    2                   4
p10           1:00        Success    0                   4

Expected Time: 2:00
Average Time on Task: 2:10
Average Task Satisfaction: 3.875
Average # Path Deviations: 0.63
Percent Success: 100%


Task 11 (Lab orders)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            -           -          -                   -
p3            2:14        Success    1                   5
p4            0:33        Success    0                   5
p5            1:49        Success    0                   5
p6            0:42        Success    0                   4
p7            1:39        Success    0                   5
p8            0:32        Success    0                   4
p9            0:46        Success    0                   4
p10           1:25        Success    0                   2

Expected Time: 2:00
Average Time on Task: 1:12
Average Task Satisfaction: 4.25
Average # Path Deviations: 0.13
Percent Success: 100%

Task 12 (Lab orders - change)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            -           -          -                   -
p3            1:42        Success    0                   5
p4            0:58        Success    1                   4
p5            -           -          -                   -
p6            0:48        Success    1                   4
p7            2:43        Success    0                   5
p8            0:39        Success    0                   5
p9            0:17        Success    0                   5
p10           0:36        Success    0                   5

Expected Time: 2:00
Average Time on Task: 1:06
Average Task Satisfaction: 4.71
Average # Path Deviations: 0.29
Percent Success: 100%


Task 13 (Lab results CDS intervention)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            -           -          -                   -
p3            5:50        Assisted   3                   1
p4            3:39        Assisted   2                   2
p5            -           -          -                   -
p6            1:19        Success    0                   3
p7            0:56        Success    0                   5
p8            3:00        Time Out   5                   1
p9            3:03        Fail       7                   1
p10           2:11        Success    3                   3

Expected Time: 3:00
Average Time on Task: 1:28
Average Task Satisfaction: 2.29
Average # Path Deviations: 2.86
Percent Success: 43%

Task 14 (Adj. of Drug-Drug intervention)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            -           -          -                   -
p3            2:31        Assisted   2                   3
p4            1:42        Success    2                   5
p5            -           -          -                   -
p6            3:58        Assisted   4                   1
p7            -           -          -                   -
p8            2:13        Time Out   4                   2
p9            1:21        Success    1                   2
p10           2:10        Success    3                   4

Expected Time: 2:00
Average Time on Task: 1:44
Average Task Satisfaction: 2.8
Average # Path Deviations: 2.67
Percent Success: 50%


Task 15 (View drug-drug intervention)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            -           -          -                   -
p3            6:02        Time Out   -                   4
p4            5:26        Success    0                   4
p5            -           -          -                   -
p6            4:37        Success    3                   3
p7            3:57        Success    1                   5
p8            5:03        Success    1                   3
p9            7:54        Fail       5                   3
p10           4:07        Success    2                   1

Expected Time: 4:00
Average Time on Task: 4:38
Average Task Satisfaction: 3.29
Average # Path Deviations: 2.00
Percent Success: 71%

Task 16 (Medication order - change)

Participant   Task Time   Outcome    # Path Deviations   Task Satisfaction
p1            -           -          -                   -
p2            -           -          -                   -
p3            0:27        Success    0                   4
p4            1:41        Success    2                   3
p5            -           -          -                   -
p6            0:57        Success    1                   4
p7            1:15        Success    0                   5
p8            0:41        Success    0                   5
p9            1:34        Success    0                   5
p10           0:54        Success    0                   3

Expected Time: 2:00
Average Time on Task: 1:04
Average Task Satisfaction: 4.14
Average # Path Deviations: 0.43
Percent Success: 100%
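The summary rows in the tables above (average time on task, average task satisfaction, average number of path deviations, percent success) can be recomputed from the per-participant rows. The sketch below is illustrative: it skips missing cells, and it averages task time over successful attempts only, which is an inference from the reported figures rather than a rule stated in the report.

    # Illustrative recomputation of a task's summary row from its participant rows.
    from statistics import mean

    def mmss_to_sec(t):            # "5:40" -> 340
        m, s = t.split(":")
        return int(m) * 60 + int(s)

    def sec_to_mmss(sec):          # 64.75 -> "1:04"
        return f"{int(sec) // 60}:{int(sec) % 60:02d}"

    def summarize(rows):
        """rows: (participant, time "m:ss" or None, outcome, path deviations, satisfaction)."""
        attempted = [r for r in rows if r[1] is not None]
        successes = [r for r in attempted if r[2] == "Success"]
        return {
            "Average Time on Task (successes only)":
                sec_to_mmss(mean(mmss_to_sec(r[1]) for r in successes)) if successes else None,
            "Average Task Satisfaction":
                round(mean(r[4] for r in attempted if r[4] is not None), 2),
            "Average # Path Deviations":
                round(mean(r[3] for r in attempted if r[3] is not None), 2),
            "Percent Success": f"{round(100 * len(successes) / len(attempted))}%",
        }

    # Task 1 rows, transcribed from the first table of this appendix.
    task1 = [("p1", "5:40", "Time Out", 6, None), ("p2", "6:13", "Time Out", 4, 2),
             ("p3", "0:33", "Success", 0, 3),     ("p4", "2:12", "Assisted", 6, 5),
             ("p5", "1:59", "Assisted", 4, 1),    ("p6", "0:22", "Success", 0, 5),
             ("p7", None, None, None, None),      ("p8", "2:29", "Time Out", 4, 1),
             ("p9", "1:58", "Success", 3, 1),     ("p10", "1:26", "Success", 1, 5)]
    print(summarize(task1))   # matches the 1:04 / 2.88 / 3.11 / 44% reported above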


Oncology Systems 3100 Hansen Way Palo Alto, CA 94304 USA tel +1 650 493 4000 www.varian.com

June 20, 2014

Dear Drummond Group,

This letter is in response to the certification of the ARIA® Oncology Information System (for Medical Oncology) product from Varian Medical Systems. This letter applies to certification criterion 170.314(d)(2), concerning auditable events and tamper-resistance measures.

For public release: Varian Medical Systems attests to the validity of the information below to satisfy the documentation requirements for testing and certification of ONC 2014 Edition criterion 170.314(d)(2).

1. Does the EHR SUT allow the following?

• Disabling the audit log — No, the EHR system does not allow a user to disable audit logging.

• Monitoring and recording of audit log status changes (if disabling is possible) — There is no monitoring or recording of audit log status changes.

• Monitoring and recording of status changes to encryption, if encryption is used to satisfy the end-user device encryption (d)(7) criteria — The encryption cannot be disabled by the end user; as a result, there is no monitoring or recording of status changes to encryption. [IN170.314(d)(2)-2.02 / IN170.314(d)(2)-2.09]

2. If the audit log can be disabled, are the audit log and audit log status recording enabled by default?

The EHR system does not allow a user to disable audit logging. [IN170.314(d)(2)-1.01-1.02]

3. If applicable, and if the EHR also allows it to be disabled, is the encryption of electronic health information on end-user devices enabled by default?

Yes, the encryption of electronic health information on end-user devices is enabled by default and cannot be modified by the end user. The EHR system does not store electronic health information on the end-user device. [IN170.314(d)(2)-1.03]

4. Does the EHR SUT permit any users to delete electronic health information?

No, users cannot delete any electronic health information. [IN170.314(d)(2)-3.03]


5. Does the EHR SUT audit logging capability monitor each of the required actions for all instances of electronic health information utilized by the EHR SUT, in accordance with the specified standard ASTM E2147-01?

Yes, the audit logging capability can monitor each of the required actions for all instances of electronic health information. Even though users cannot delete health information, there is an error capability that falls under the change category. [IN170.314(d)(2)-3.04]

6. Describe the method(s) through which the audit logs are protected from being changed, overwritten, or deleted by the EHR technology itself.

There is no functionality within the EHR to modify or delete audit logs. The audit logs are automatically updated by the EHR. [IN170.314(d)(2)-4.01]

7. Describe the method(s) through which the EHR SUT is capable of detecting whether the audit logs have been altered. NOTE – This type of alteration would be from outside the EHR (e.g., hacking, manual tampering, other software besides the EHR).

The EHR is capable of detecting altered audit logs by the following means. Each row in the audit log contains a checksum to detect whether any changes were made via external (to the EHR) means. Each patient's set of audit log records has a checksum value to detect any deletions from the audit log. The checksum values are inspected at three different points to ensure there has been no data tampering:

1) when the audit logs are viewed by a user;

2) via a scheduled database job;

3) when updated, for the patient-level checksum. [IN170.314(d)(2)-5.01]
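The letter does not identify the checksum algorithm. A minimal sketch of the general technique (a keyed digest stored with each row and re-verified on read) is shown below; HMAC-SHA-256, the field names, and the key handling are assumptions for illustration, not a description of ARIA's implementation.

    # Minimal sketch of per-row tamper detection with a keyed digest.
    # Algorithm choice, field names, and key handling are illustrative assumptions.
    import hashlib
    import hmac

    SECRET_KEY = b"replace-with-a-protected-key"   # would be stored outside the audit table

    def row_checksum(row):
        canonical = "|".join(f"{k}={row[k]}" for k in sorted(row) if k != "checksum")
        return hmac.new(SECRET_KEY, canonical.encode("utf-8"), hashlib.sha256).hexdigest()

    def verify_row(row):
        """Re-compute the digest on read and compare it with the stored value."""
        return hmac.compare_digest(row.get("checksum", ""), row_checksum(row))

    entry = {"timestamp": "2014-06-20T10:15:00", "user": "jdoe",
             "patient_id": "12345", "action": "change"}
    entry["checksum"] = row_checksum(entry)
    print(verify_row(entry))       # True
    entry["action"] = "delete"     # simulated external tampering
    print(verify_row(entry))       # False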

I hereby attest that all above statements are true, as an authorized signing authority on behalf of my organization.

Regards,

_________________________________________________________________
Senior Management approval signature and date signed

Sukhveer Singh
Vice President and General Manager of OCS

June 20th, 2014