Transcript of: Drummond Group LLC · 2019-09-14

Page 1

PointClickCare®

September 15, 2016

Sonia Galvan
Drummond Group

Dear Sonia,

Regarding Test Criteria 170.314(g)(3), Safety Enhanced Design: PointClickCare follows an Agile development process with an embedded UCD process based on guidelines from ISO 9241-11 and ISO 13407. The application's user experience achieves high satisfaction and safety levels in the context of this criterion, both through the development cycle and through continuous tracking of usage over the years the product has been in production. NPS satisfaction scores are monitored regularly and reflect the usability and safety of the product.

The UCD process is embedded within our software development cycle and is as follows:

1) Analysis of business requirements is performed to address user requirements and user experience through storyboards and use case scenarios. Specifications are written to clearly articulate what is to be built, with appropriate screen mock-ups and wireframes where necessary. The targeted users and expectations are specified at this stage of the development life cycle.

2) The context of use is established, and checkpoints at which work is visually displayed and demonstrated occur at the completion of each sprint within the development cycle. Appropriate stakeholders attend the end-of-sprint checkpoints, including Engineering, QA, Business Analysts, Technical Product Managers, and Product Specialists. Any findings and revalidation occur at this time and are addressed in the following sprint as needed. A dedicated UX architect is also assigned as an integral part of this development phase to ensure consistency and adherence to UI standards.

3) Quality Assurance is an integral part of the agile team and ensures that the user interface complies with the specification, UI standards, and safety requirements. A ticketing system is used to ensure that any findings are tracked and resolved before the end of the development cycle.

4) Beta/pilot testing is a phase in the development cycle where the product is ready to have its interface and functions tested for feedback from customers and other external organizations. During this period, all interface findings are logged in the ticketing system and categorized appropriately for fixes before GA or in future versions. Safety checks are of utmost importance, and such issues are flagged as must-fix before GA. This is also when monitoring of the actual customer user experience is started and tracked.

@pointclickcare · /pointclickcare · pointclickcareEHR · www.pointclickcare.com · success@pointclickcare.com · 1.800.277.5889

Page 2

Attestation of Quality Management System — September, 2016

5) Further tracking of the user experience occurs through an Early Adopter (EA) program, where a production-level deployment is done on a limited scale for an observation period. Findings on any user-centric interfaces are also logged in the ticketing system and evaluated as must-fix for GA or deferred to a future version.

6) Once the product is made GA, continuous tracking of usage and improvement is carried out through issue tickets logged directly by our customers and through an NPS survey, which includes usability and interface-related issues. Such tickets are prioritized for service packs or future releases as necessary.

Sincerely,

[Signature]

VP, Platform Product Management

905-858-8885 ext. 460, Megan.d@pointclickcare.com


Page 3

1 | P a g e

Usability Test Report

Testing for:

§170.314(a)(1) Computerized Provider Order Entry (CPOE)

§170.314(a)(6) Maintain active medication list

§170.314(a)(7) Maintain an active medication allergy list

Page 4

2 | P a g e

EHR Usability Test Report of PointClickCare v3.7.1.2

Date of Usability Test: September 26, 2013
Date of Report: September 27, 2013
Last Updated: November 2, 2016
Report Prepared By: PointClickCare

Contents

EXECUTIVE SUMMARY ...................................................................................................................................... 3

INTRODUCTION ................................................................................................................................................. 5

METHOD ............................................................................................................................................................ 5

PARTICIPANTS ............................................................................................................................................... 5

STUDY DESIGN ............................................................................................................................................... 6

TASKS ............................................................................................................................................................. 6

PROCEDURES ................................................................................................................................................. 7

TEST LOCATION ............................................................................................................................................. 7

TEST ENVIRONMENT ..................................................................................................................................... 8

TEST FORMS AND TOOLS .................................................................................................................................. 8

PARTICIPANT INSTRUCTIONS ........................................................................................................................ 8

USABILITY METRICS ....................................................................................................................................... 9

DATA SCORING ............................................................................................................................................ 10

RESULTS ........................................................................................................................................................... 11

DATA ANALYSIS AND REPORTING ............................................................................................................... 11

DISCUSSION OF THE FINDINGS ........................................................................................................................ 12

Appendix 1: SAMPLE RECRUITING SCREENER ............................................................................................. 14

Appendix 2: PARTICIPANT DEMOGRAPHICS................................................................................................ 16

Appendix 3: MODERATOR’S GUIDE ............................................................................................................ 17

Appendix 4: SYSTEM USABILITY SCALE QUESTIONNAIRE ........................................................................... 29

Page 5

3 | P a g e

EXECUTIVE SUMMARY

A usability test of PointClickCare v3.7.1.2 Long-Term Post-Acute Care Electronic

Health Record (EHR) was conducted on September 26, 2013 by PointClickCare.

The testing was conducted with remote testers through a WebEx meeting. The

purpose of the testing was to validate the usability of the current user interface,

and provide evidence of usability in the EHR under test (EHRUT). During the

usability test, 2 healthcare providers and 1 healthcare business analyst who match

the target demographic criteria served as participants and used the EHRUT in

simulated but representative tasks.

This study collected performance data on 9 tasks typically conducted in an EHR:

First impression on application (General task; no need to execute)

Enter a medication allergy and review the allergy list to view/validate (Medication Allergy List)

Edit the allergy and review the allergy list to view/validate the change (Medication Allergy List)

Add a medication and review the medication list to view/validate (Medication List)

Edit the medication and review the medication list to view/validate the change (Medication List)

Add a medication that is on the medication allergy list and identify system warning of the allergy (Medication List)

Add a radiology order and edit radiology order and view/validate the change (CPOE)

Add a lab order and edit lab order and view/validate the change (CPOE)

Add a medication order, Edit medication order and view/validate the change (CPOE)

During the 2-hour usability test, each participant was greeted by the administrator

and introduced to the goals of the testing. Participants had prior experience with

the EHR. They had all been working with the EHR in their facilities and had

previously received end user training when the application was introduced to

their facility or when they started their employment. The administrator

introduced the test and instructed the participants to complete a series of tasks,

one at a time, using the EHRUT. During the test, the administrator and the data

logger recorded the user performance and collected the user comments. No

assistance was given regarding how to complete the tasks.

The following data was collected for each of the participants:

Participant's verbalizations

Participant's satisfaction with the system

Page 6

4 | P a g e

All participant data was de-identified – no correspondence could be made from the

identity of the participant to the data collected. Various recommended metrics, in

accordance with the examples set forth in the NIST Guide to the Processes

Approach for Improving the Usability of Electronic Health Records, were used to

evaluate the usability of the EHRUT. Following is a summary of the performance

and rating data collected on the EHRUT.

Task | N | Task Success | Task Deviation | Errors
Enter a medication allergy | 3 | 100% | 0% | 0%
Correct a medication allergy | 3 | 100% | 0% | 0%
Enter a medication in the chart | 3 | 100% | 0% | 0%
Correct a medication entry | 3 | 100% | 0% | 0%
Add a medication the patient has an allergy to | 3 | 100% | 0% | 0%
Enter and edit a radiology order | 3 | 100% | 0% | 0%
Enter and edit a lab order | 3 | 100% | 0% | 0%
Enter and edit a medication order | 3 | 100% | 0% | 0%

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system, based on performance with these tasks, at 80. (This represents the average of the 3 testers and is based on their answers to the System Usability Scale questionnaire. There are a total of 10 questions, each on a scale from 1 to 5, for a maximum of 50 points; both the maximum total and each individual's total were multiplied by 2 to give a result out of 100.) In addition to the performance data, the following qualitative observations were made:

Major findings:

Areas identified as needing improvement

While adding a new allergy, if the allergen is not in the Medispan or custom library, the system clears any additional data entered/selected for the new allergy (e.g., Allergy Type).

Definitions need to be clarified for:

o Allergy Severity (High/Moderate/Low)

o Allergy Types (Intolerance/Allergy)

Page 7

5 | P a g e

Asterisks to denote required fields are not readily visible

Updating the allergy type clears out the allergen entered in the “Allergen” textbox without any warning to the user.

Areas identified as positive

Users feel that the change from an EMAR that does not allow a user to delete and copy a medication to one that does will help prevent possible errors of omission.

INTRODUCTION

The EHRUT tested for this study was PointClickCare v3.7.1.2. Designed to present

medical information to healthcare providers in Long-Term Post-Acute Care

Facilities, PointClickCare deploys a single integrated EHR platform to all of our

2,000+ customers, through a multi-tenancy Software-as-a-Service (SaaS)

technology model. The PointClickCare integrated EHR platform is an extremely

flexible and scalable platform designed to meet the dynamic demands of Long-

Term Post-Acute Care. By connecting clinical, billing, and administration processes

across a single, web-based platform, information is only entered once, duplicate

documentation is avoided, staff have more time to spend with residents, errors

are reduced, and RUGs are optimized.

The PointClickCare application is a Java-based web application. The overall

technology stack constitutes a presentation layer, a business layer, and a data

layer. Specific technologies used are: Java JDK, Spring Framework, Microsoft SQL

Server, and Caucho Resin application server.
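
To illustrate the layered structure described above, the following is a minimal, hypothetical Java/Spring sketch of how a presentation layer, business layer, and data layer are typically separated. The class names, endpoint, and SQL are illustrative assumptions and are not taken from the PointClickCare codebase.

// Hypothetical sketch only: illustrates the presentation / business / data
// layering described above. Names, endpoint, and SQL are assumptions.
import java.util.List;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
class MedicationListController {            // presentation layer
    private final MedicationListService service;

    MedicationListController(MedicationListService service) {
        this.service = service;
    }

    @GetMapping("/residents/{id}/medications")
    List<String> activeMedications(@PathVariable("id") long id) {
        return service.activeMedications(id);
    }
}

@Service
class MedicationListService {               // business layer
    private final JdbcTemplate jdbc;        // data layer access (SQL Server via Spring JDBC)

    MedicationListService(JdbcTemplate jdbc) {
        this.jdbc = jdbc;
    }

    List<String> activeMedications(long residentId) {
        return jdbc.queryForList(
                "SELECT description FROM medication WHERE resident_id = ? AND active = 1",
                String.class, residentId);
    }
}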

METHOD

PARTICIPANTS

A total of 3 participants were tested on the EHRUT. Participants in the test were

clinical and administrative staff from Long-Term Post-Acute Care facilities.

Participants were recruited by PointClickCare and were not compensated for their

time. In addition, participants had no direct connection to the development of, or

organization producing, the EHRUT. Participants were not employees of the EHR

vendor or supplier organization and were actual users who had the same

orientation and level of training as all users have received.

For the test purposes, end-user characteristics were identified and translated into

a recruitment screener used to solicit potential participants; an example of a

screener is provided in Appendix 1.

Page 8

6 | P a g e

The following is a table of participants by characteristics, including: demographics,

professional experience, computing experience, and user’s need for assistive

technology. Participant names were replaced with participant IDs so that an

individual’s data could not be tied back to an individual identity.

Part ID | Gender | Age | Education | Occupation | Professional Experience | Computer Experience | Product Experience | Assistive Technology Needs
1 | F | 40-59 | College | RN/ADON | 25+ | Extensive | 5 years | None
2 | F | 40-59 | College | ADON | 25+ | Extensive | 4 years | None
3 | F | 40-59 | College | Business Analyst | 25+ | Extensive | 5 years | None

Three participants (matching the demographics in the recruitment screener) were

recruited and three participated in the usability test. None of the participants

failed to engage in the study.

Participants were scheduled for one-hour sessions.

STUDY DESIGN

Overall, the objective of this test was to uncover areas where the application

performed well – that is, effectively, efficiently, and with satisfaction – and areas

where the application failed to meet the needs of the participants. The data from

this test may serve as a baseline for future tests with an updated version of the

same EHR and/or comparison with other EHRs, provided the same tasks are

performed. In short, this testing serves not only to record or benchmark current usability, but also to identify areas where improvements must be made.

During the usability test, participants interacted with one EHR. Each participant

used the system in their individual location, and was provided with the same

instructions. The system was evaluated for effectiveness, efficiency and

satisfaction as defined by measures collected and analyzed for each participant:

Path deviations

Participant’s verbalizations (comments)

Participant’s satisfaction ratings of the system

Additional information about the various measures can be found in the section on Usability Metrics.

TASKS

A number of tasks were constructed that would be realistic and representative of

the kinds of activities a user might perform with this EHR, including:

Page 9

7 | P a g e

1. Enter and edit an order for medications

2. Enter and edit an order for Lab tests

3. Enter and edit an order for radiology study

4. Add/Edit and view medication allergies

5. View Medication list

Tasks were selected based on their frequency of use, criticality of function, and those that might be most troublesome for users.

PROCEDURES

As users were located across the country, the test was performed through an

online meeting tool where the users and the test administrators could both talk

and see each other's computer screens.

To ensure that the test ran smoothly, two staff members participated in this test,

the usability administrator and the data logger. The usability testing staff members

who conducted the test were experienced clinical application product and

program managers.

The administrator moderated the session, including administering instructions and

tasks. The administrator also monitored task times, obtained post-task rating

data, and took notes on participant comments. A second person served as the data

logger and took notes on task success, path deviations and comments.

Participants were instructed to perform the tasks (see specific instructions below):

As quickly as possible making as few errors and deviations as possible,

Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use,

Without using a “think aloud” technique.

For each task, the participants were given verbal instructions describing the task to be completed using the system.

Following the session, the administrator gave the participant the post-test

questionnaire (i.e., the System Usability Scale; see Appendix 4).

Participants' demographic information, task success rate, deviations, verbal

responses, and post-test questionnaire responses were recorded into a

spreadsheet.

Participants were thanked for their time.

TEST LOCATION

Page 10

8 | P a g e

Each study participant performed the testing at their employment site in a quiet

location. Communication with the participants was through an online meeting

(WebEx) so the participant and the test administrator were able to communicate

and view each other's screens.

TEST ENVIRONMENT

The EHR would typically be used in a healthcare office or facility. In this instance,

the EHRUT was used at each tester’s place of employment. For testing, the

computer used was a desktop computer running a Windows operating system. The

participants used a mouse and keyboard when interacting with the EHRUT. The

PointClickCare application used the display that was standard at the tester’s home

office facility. The application was set up by the tester’s employer according to the

vendor’s (PointClickCare) documentation describing the system set-up and

preparation. The application itself was running on a Windows platform using a

testing/training database. Technically, the system performance (i.e., response

time) was representative of what actual users would experience in a field

implementation. Additionally, participants were instructed not to alter any of the

default system settings (such as control of font size).

TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used, including:

1. Moderator’s Guide

2. Post-Test Questionnaire

Examples of these documents can be found in Appendices 3-4 respectively. The

Moderator’s Guide was devised so as to be able to capture requisite data.

PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each participant (also

see the full Moderator’s Guide in Appendix 3):

Thank you for participating in this study. Your input is very important. Our

session today will last about 60 minutes. During that time, you will use an

instance of an electronic health record.

I will ask you to complete a few tasks using this system and answer some

questions. You should complete the tasks as quickly as possible making as few

errors as possible. Please try to complete the tasks on your own following the

instructions very closely. Please note that we are not testing you, we are

testing the system; therefore, if you have difficulty, all this means is that

something needs to be improved in the system. I will be here in case you need

Page 11

9 | P a g e

specific help, but I am not able to instruct you or provide help in how to use the

application.

Overall, we are interested in how easy (or how difficult) this system is to use,

what in it would be useful to you, and how we could improve it. I did not have

any involvement in its creation, so please be honest with your opinions. All of

the information that you provide will be kept confidential and your name will

not be associated with your comments at any time. Should you feel it

necessary, you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR and, as

their first task, were given time to explore the system and make comments;

however, as they were all familiar with the system, this offer was declined. Once

this task was complete, the administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that point,

please perform the task and say “Done” once you believe you have

successfully completed the task. I would like to request that you not talk aloud

or verbalize until you are done.

Participants were then given 9 tasks to complete. Tasks are listed in the

Moderator’s Guide in Appendix 3.

USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving the

Usability of Electronic Health Records, EHRs should support a process

that provides a high level of usability for all users. The goal is for users to

interact with the system effectively, efficiently, and with an acceptable

level of satisfaction. To this end, metrics for effectiveness, efficiency and

user satisfaction were captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of PointClickCare by measuring participant

success rates and errors

2. Efficiency of PointClickCare by measuring the average task

time and path deviations

3. Satisfaction with PointClickCare by measuring ease of use ratings

Page 12

10 | P a g e

DATA SCORING

The following table details how tasks were scored and how errors were evaluated:

Effectiveness: Task Success
A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors. This was expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations
The participant’s path (i.e., steps) through the application was recorded. Deviations occurred if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path was divided by the number of optimal steps to provide a ratio of path deviation.

Satisfaction: Task Rating
Each participant’s subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). These data were averaged across participants. Common convention is that average ratings for systems judged easy to use should be 3.3 or above. To measure participants’ confidence in, and likeability of, PointClickCare overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Items included “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.” See the full System Usability Scale questionnaire in Appendix 4.
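
As a worked illustration of the scoring rules above, the sketch below (in Java, with hypothetical sample values rather than data from this study) computes a task success percentage, a path deviation ratio, and a mean post-task rating for a single task, following the definitions given for each measure.

// Hypothetical sketch of the scoring described above; the sample values are
// illustrative, not data from this study.
public class TaskScoring {
    public static void main(String[] args) {
        int attempts = 3;                        // N: times the task was attempted
        int successes = 3;                       // correct outcome, unassisted, within time
        int[] observedSteps = {6, 6, 7};         // steps each participant actually took
        int optimalSteps = 6;                    // steps in the optimal path
        int[] postTaskRatings = {2, 2, 1};       // "Overall, this task was:" ratings

        // Task success: successes divided by attempts, reported as a percentage
        double successRate = 100.0 * successes / attempts;

        // Path deviation: observed steps divided by optimal steps
        // (averaged across participants here for illustration)
        double deviationRatio = 0.0;
        for (int steps : observedSteps) {
            deviationRatio += (double) steps / optimalSteps;
        }
        deviationRatio /= observedSteps.length;

        // Satisfaction: mean post-task rating across participants
        double meanRating = 0.0;
        for (int rating : postTaskRatings) {
            meanRating += rating;
        }
        meanRating /= postTaskRatings.length;

        System.out.printf("Success: %.0f%%  Deviation ratio: %.2f  Mean rating: %.2f%n",
                successRate, deviationRatio, meanRating);
    }
}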

Page 13

11 | P a g e

RESULTS

DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods

specified in the Usability Metrics section above. Participants who failed to follow

session and task instructions had their data excluded from the analyses. There

were no deletions.

The usability testing results for the EHRUT are detailed below.

Task | N | Task Success | Task Deviation | Errors
Enter a medication allergy | 3 | 100% | 0% | 0%
Correct a medication allergy | 3 | 100% | 0% | 0%
Enter a medication in the chart | 3 | 100% | 0% | 0%
Correct a medication entry | 3 | 100% | 0% | 0%
Add a medication the patient has an allergy to | 3 | 100% | 0% | 0%
Enter and edit a radiology order | 3 | 100% | 0% | 0%
Enter and edit a lab order | 3 | 100% | 0% | 0%
Enter and edit a medication order | 3 | 100% | 0% | 0%

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system, based on performance with these tasks, at 80. (This represents the average of the 3 testers and is based on their answers to the System Usability Scale questionnaire. There are a total of 10 questions, each on a scale from 1 to 5, for a maximum of 50 points; both the maximum total and each individual's total were multiplied by 2 to give a result out of 100.) Broadly interpreted, scores under 60 represent systems with poor usability, while scores over 80 are considered to be above average.
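
The SUS arithmetic as described in this report (sum each tester's ten 1-to-5 responses for a maximum of 50 points, multiply by 2 for a score out of 100, and average across testers) can be sketched as follows. The responses shown are hypothetical, and the standard Brooke scoring referenced in Appendix 4 would additionally reverse the even-numbered items before summing.

// Hypothetical sketch of the SUS arithmetic as described in this report;
// the responses below are illustrative, not the testers' actual answers.
public class SusScore {

    // Sum the ten 1-5 responses (max 50) and double to give a score out of 100.
    // (Standard Brooke scoring would also reverse the even-numbered items.)
    static double testerScore(int[] responses) {
        int sum = 0;
        for (int r : responses) {
            sum += r;
        }
        return sum * 2.0;
    }

    public static void main(String[] args) {
        int[][] allResponses = {
                {4, 4, 5, 4, 4, 4, 5, 4, 4, 2},
                {4, 3, 4, 4, 4, 4, 4, 4, 4, 3},
                {4, 4, 4, 4, 4, 4, 4, 4, 4, 3},
        };

        double mean = 0.0;
        for (int[] responses : allResponses) {
            mean += testerScore(responses);
        }
        mean /= allResponses.length;

        System.out.printf("Mean SUS score (as computed in this report): %.1f%n", mean);
    }
}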

Page 14

12 | P a g e

DISCUSSION OF THE FINDINGS

EFFECTIVENESS

It appears that the application is effective for these testers. It might merit further testing in the future, with newer users or those not accustomed to the system, to determine how intuitive the user interface is.

EFFICIENCY

Although these testers were not timed, they completed each of the tasks rapidly and without errors. It can therefore be assumed that users who are acclimated to the system are able to perform tasks with high efficiency.

SATISFACTION

Overall, the testers are satisfied with the application and its functionality. They are users who are in the application frequently to manage the care of their residents. One of the users, although not clinical, is responsible for supporting her clinical team and functions as the system administrator and, therefore, is not only knowledgeable about the application but also about how many users work in it.

MAJOR FINDINGS

It was clear that the allergies could be re-evaluated with consideration given to how to provide feedback to the user regarding the actual selections and what they quantitatively mean. This would help bring consistency to the data reported and help avoid confusion on the part of the user. These findings will be evaluated and placed on the backlog for inclusion in a future release.

AREAS FOR IMPROVEMENT

Some of the displays related to supplemental information need to be reviewed, such as the placement of asterisks for mandatory fields.

These will be evaluated for a future release.

Page 15

13 | P a g e

APPENDICES

The following appendices include supplemental data for this usability

test report. Following is a list of the appendices provided:

1: Sample Recruiting Screener

2: Participant Demographics

3: Example Moderator’s Guide

4: System Usability Scale Questionnaire

Page 16

14 | P a g e

Appendix 1: SAMPLE RECRUITING SCREENER

1. Recruiting

We are recruiting individuals to participate in a usability study for an electronic health record. We would like to ask

you a few questions to gather participation information. This should only take a few minutes of your time. This is

strictly for research purposes. You will not be paid to participate in Training and Testing.

Please answer the following questions:

1. Are you male or female? Male Female

2. Have you participated in a focus group or usability test in the

past 6 months? Yes No

3. Do you, or does anyone in your home, work in marketing research, usability research, web design, or

other computer work? Yes No

4. Do you, or does anyone in your home, have a commercial or research interest in an electronic health

record software or consulting company? Yes No

5. Which of the following best describes your age?

23 to 39 40 to 59 60 to 74 75 or older

6. Which of the following best describes your race or ethnic group?

Caucasian Asian Black/African American

Latino/a or Hispanic Other

7. Do you require any assistive technologies to use a computer? Yes No

If Yes, Please Describe:

Professional Demographics:

8. What is your current position and title? (Must be healthcare related)

_____RN: Specialty

_____Physician: Specialty

_____Resident: Specialty

_____Administrative Staff

_____Other Medical: Specialty

9. How long have you held this position?

Page 17

15 | P a g e

10. Describe your work location (or affiliation) and environment [e.g., private practice, health system,

government clinic, etc.]

11. Which of the following describes your highest level of education?

High School/GED Some College College Graduate (RN/BSN)

Postgraduate (MD/PhD) Other: Please Describe:

2. Recruiting

Computer Expertise:

12. Besides reading email, what professional activities do you do on the computer?

Access EHR    Research    News    Shopping/Banking    Digital Pictures    Programming    Microsoft Office Products

13. About how many hours per week do you spend on the computer?

0 to 10 11 to 25 26 or More

14. What computer platform do you usually use? [e.g., Mac, Windows, etc.]

15. What Internet browser(s) do you usually use? [e.g. Firefox, IE, AOL, etc.]

16. In the last month, how often have you used an electronic health record?

17. How many years have you used electronic health records?

18. How many EHRs do you use or are you familiar with?

19. How does your work environment record/retrieve patient records?

On Paper Some Paper/Some Electronic All Electronic

Contact Information: Name of participant:

Address:

City: State: Zip:

Daytime phone: Evening Phone: Cell phone:

Email address:

Page 18

16 | P a g e

Appendix 2: PARTICIPANT DEMOGRAPHICS

The report should contain a breakdown of the key participant demographics. A representative list is shown

below.

Following is a high-level overview of the participants in this study.

Gender

Men 0

Women 3

Total (participants) 3

Occupation/Role

RN/LVN 2

Physician 0

Admin Staff 1

Total (participants) 3

Years of Experience

Combined years of experience with Health Records: 31

Page 19

17 | P a g e

Appendix 3: MODERATOR’S GUIDE

Nine tasks are presented here for illustration.

EHRUT Usability Test Moderator’s Guide

Administrator Janelle Miller

Data Logger Jane Niemi

Date 9/26/2013 Time 1:00PM

Participant # 3

Location Remote

Prior to testing

☐ Confirm schedule with participant

☐ Ensure participant’s environment is running properly

Prior to each participant: ☐ Reset application ☐ Start session recordings with tool

Prior to each task: ☐ Reset application to starting point for next task

Orientation (5 minutes)

Thank you for participating in this study. Our session today will last 60 minutes.

During that time, you will take a look at an electronic health record system.

I will ask you to complete a few tasks using this system and answer some questions. We

are interested in how easy (or how difficult) this system is to use, what in it would be

useful to you, and how we could improve it. You will be asked to complete these tasks

on your own, trying to do them as quickly as possible, with the fewest possible errors or

deviations. Do not do anything more than asked. If you get lost or have difficulty, I

cannot help you with anything to do with the system itself. Please save your detailed

Page 20

18 | P a g e

comments until the end of a task or the end of the session as a whole, when we can

discuss freely. I did not have any involvement in its creation, so please be honest with

your opinions.

The product you will be using today is the production version of the PointClickCare application.

Do you have any questions or concerns?

Preliminary Questions (5 minutes)

What is your job title / appointment?

How long have you been working in this role?

What are some of your main responsibilities?

Tell me about your experience with electronic health records.

Page 21

19 | P a g e

Task 1: First Impressions (120 Seconds)

This is the application you will be working with. Have you heard of it? If so, tell me what you know about it.

☒ Yes ☐ No

Show test participant the EHRUT.

Please don’t click on anything just yet. What do you notice? What are you able to do here? Please be specific.

Administrator /Note Taker Comments: All three of the testers were familiar with the application and had either trained on it at an earlier time or used it on a daily basis.

Page 22

20 | P a g e

Task 2: Enter a medication allergy (120 Seconds) (Medication Allergy List)

Take the participant to the starting point for the task.

When taking the resident’s history, you want to document the fact that they have an allergy to Penicillin. Be sure to check the medication allergy list of the resident to be sure it is correctly documented.

Success: ☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments:

Observed Errors and Verbalizations: Comments: No errors

Rating: Overall, this task was: 2

Show participant written scale: “Very Easy” (1) to “Very Difficult” (5)

Administrator / Note Taker Comments:

Testers all noted that this was easy and something with which they were familiar.

They felt that it was easy for them because they had been trained in the system.

They felt it would be more helpful if there were clearer definitions for some of the fields, such as Allergy Type. There was too much confusion regarding what constitutes an allergy as compared to an intolerance.

They also felt that Mild/Moderate/Severe needed clearer definitions visible to the user (as each person could potentially define them differently).

Also, if the users started to enter an allergy and had typed it in but then realized that the allergy type was incorrect, when they changed the allergy type the data that they had initially entered was removed.

The users felt the asterisks which denote mandatory fields could be better placed, as they were not intuitively seen by the user.

Page 23

21 | P a g e

Task 3: Correct a medication allergy entry on a chart (4 minutes) (Medication Allergy List)

Take the participant to the starting point for the task.

Now you realize you documented the medication allergy on the incorrect chart. You need to go into the record and strikeout the medication allergy and then document it in the correct chart. Be sure to check the allergy lists of both patients to be sure they are documented correctly.

Success:

☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments: No comments

Observed Errors and Verbalizations: Comments: No errors

Rating: Overall, this task was: 2

Administrator / Note Taker Comments: No comments from users. Task was completed easily.

Page 24

22 | P a g e

Task 4: Enter a medication on a chart (3 minutes) (Medication List)

Take the participant to the starting point for the task.

Enter a medication, Simvastatin 10 mg., PO QD in the chart of a resident. Check the medication list to be sure it is listed correctly.

Success: ☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments: See below

Observed Errors and Verbalizations: Comments: No errors

Rating: Overall, this task was: 2

Administrator / Note Taker Comments: Task was easily completed by the users. One user was on an older version of the software that handled changing medications differently. She was very pleased that the system no longer allows the user to copy a medication to change it, as she felt it was too easy to make a transcription error. She stated that this was a significant improvement over the previous version.

Page 25

23 | P a g e

Task 5: Correct a medication entry (4 minutes) (Medication List)

Take the participant to the starting point for the task.

You realize you entered the medication dose incorrectly and it should be 20 Mg, not 10 Mg. Go in the chart and correct the dosage. Be sure to check the medication list to be sure it is correct.

Success:

☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments: No comments

Observed Errors and Verbalizations: Comments: No errors

Rating: Overall, this task was: 2

Administrator / Note Taker Comments: No comments from users. Task was completed easily.

Page 26

24 | P a g e

Task 6: Enter a medication that the patient has an allergy to. (120 Seconds) (Medication List) (Medication Allergy List)

Take the participant to the starting point for the task.

Enter a medication, Penicillin, for the same patient for whom you documented a medication allergy to penicillin. What is the result?

Success:

☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments: No comments

Observed Errors and Verbalizations: Comments: No errors

Rating: Overall, this task was: 1

Administrator / Note Taker Comments: No comments from users. Task was completed easily.

Page 27

25 | P a g e

Task 7: Enter a physician’s order for a CT of head/brain (120 Seconds) (CPOE)

Take the participant to the starting point for the task.

Enter a physician’s order for a CT scan of the head/brain and then filter the orders to be sure it is there.

Success:

☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments: No comments

Observed Errors and Verbalizations: Comments: No errors

Rating: Overall, this task was: 1

Administrator / Note Taker Comments: No comments from users. Task was completed easily.

Page 28

26 | P a g e

Task 8: Enter a physician’s order for a fasting Glucose (120 Seconds) (CPOE)

Take the participant to the starting point for the task. Enter an order for a weekly fasting blood glucose lab test. Filter the orders to be sure that the weekly order is there.

Success:

☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments: No comments

Observed Errors and Verbalizations: Comments: No errors

Rating: Overall, this task was: 1

Administrator / Note Taker Comments: No comments from users. Task was completed easily.

Page 29

27 | P a g e

Task 9: Enter a physician’s order for a medication (120 Seconds) (CPOE)

Take the participant to the starting point for the task.

Enter a physician’s order for a medication and then filter the orders to be sure it is there.

Success:

☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments: No comments

Observed Errors and Verbalizations: Comments: No errors

Rating: Overall, this task was: 1

Administrator / Note Taker Comments: No comments from users. Task was completed easily.

Page 30

28 | P a g e

Final Questions (10 Minutes)

What was your overall impression of this system? Easy to use.

What aspects of the system did you like most? What aspects of the system did you like least?

No issues except for the Allergies, as previously stated.

Were there any features that you were surprised to see?

None

What features did you expect to encounter but did not see? That is, is there anything that is missing in this application?

Would like more clarity around definitions, so they are readily visible to the user.

Compare this system to other systems you have used. Would you recommend this system to your colleagues?

No comparisons.

Page 31

29 | P a g e

Appendix 4: SYSTEM USABILITY SCALE QUESTIONNAIRE

In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems usability,” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).

Tester 1

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system

Each item is rated on a scale from 1 (Strongly disagree) to 5 (Strongly agree).

Page 32

30 | P a g e

Tester 2

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system

Each item is rated on a scale from 1 (Strongly disagree) to 5 (Strongly agree).

Page 33

31 | P a g e

Tester 3

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system

Each item is rated on a scale from 1 (Strongly disagree) to 5 (Strongly agree).

Page 34

Page 35

1 | P a g e

Usability Test Report

Testing for:

§170.314(a)(2) Drug-drug, drug-allergy interaction checks

§170.314(a)(10) Drug-formulary checks

§170.314(b)(3) Electronic prescribing

Page 36: 1 - Drummond Group LLC · 2019-09-14 · location. Communication with the participants was through an online meeting (WebEx) so the participant and the test administrator were able

2 | P a g e

EHR Usability Test Report of PointClickCare v3.7.10.2

Date of Usability Test: September 16, 2016
Date of Report: September 19, 2016
Report Prepared By: PointClickCare (Wescom Solutions Inc.) - Product Management

Contents

EXECUTIVE SUMMARY ...................................................................................................................................... 3

INTRODUCTION ................................................................................................................................................. 5

METHOD ............................................................................................................................................................ 5

PARTICIPANTS ............................................................................................................................................... 5

STUDY DESIGN ............................................................................................................................................... 6

TASKS ............................................................................................................................................................. 6

PROCEDURES ................................................................................................................................................. 7

TEST LOCATION ............................................................................................................................................. 8

TEST ENVIRONMENT ..................................................................................................................................... 8

TEST FORMS AND TOOLS .................................................................................................................................. 8

PARTICIPANT INSTRUCTIONS ........................................................................................................................ 8

USABILITY METRICS ....................................................................................................................................... 9

DATA SCORING .............................................................................................................................................. 9

RESULTS ........................................................................................................................................................... 11

DATA ANALYSIS AND REPORTING ............................................................................................................... 11

DISCUSSION OF THE FINDINGS ........................................................................................................................ 12

Appendix 1: SAMPLE RECRUITING SCREENER ............................................................................................. 14

Appendix 2: PARTICIPANT DEMOGRAPHICS................................................................................................ 16

Appendix 3: MODERATOR’S GUIDE ............................................................................................................ 17

Appendix 4: SYSTEM USABILITY SCALE QUESTIONNAIRE ........................................................................... 27

Page 37

3 | P a g e

EXECUTIVE SUMMARY

A usability test of PointClickCare v3.7.10.2 Long-Term Post-Acute Care Electronic

Health Record (EHR) was conducted on September 16, 2016 by PointClickCare.

The testing was conducted with remote testers through a WebEx meeting. The

purpose of the testing was to validate the usability of the current user interface,

and provide evidence of usability in the EHR under test (EHRUT). During the

usability test, 2 healthcare administrative and 5 clinical users were selected as

participants and used the EHRUT in simulated but representative tasks.

This study collected performance data on 7 tasks typically conducted in an EHR:

First impression on application (General task; no need to execute)

Enable Drug Allergy check under Order Configuration (Drug Allergy)

Enable Drug Interaction check under Order Configuration and enable Severe, Moderate and Mild drug interactions (Drug Interaction)

Create Formulary management for pharmacy, activate formulary and add items/medication to formulary (Formulary Management)

Enter and complete/ePrescribe an order with medication to which patient has allergy and Moderate Drug Interaction (CPOE)

Remove Moderate and Minor drug interaction check from configuration (Drug Interaction)

Enter and complete/ePrescribe an order with medication which can cause Minor Drug Interaction (CPOE)

During the 1-hour usability test, each participant was greeted by the administrator

and introduced to the goals of the testing. Participants had prior experience with

the EHR. They had all been working with the EHR in their facilities and had

previously received end user training when the application was introduced to

their facility or when they started their employment. The administrator

introduced the test and instructed the participants to complete a series of tasks,

one at a time, using the EHRUT. During the test, the administrator and the data

logger recorded the user performance and collected the user comments.

The following data was collected for each of the participants:

Participant's verbalizations

Participant's satisfaction with the system

All participant data was de-identified – no correspondence could be made from the

identity of the participant to the data collected. Various recommended metrics, in

accordance with the examples set forth in the NIST Guide to the Processes

Approach for Improving the Usability of Electronic Health Records, were used to

Page 38

4 | P a g e

evaluate the usability of the EHRUT. Following is a summary of the performance

and rating data collected on the EHRUT.

Task | N | Task Success | Errors
Enable Drug Allergy check under Order Configuration | 2 | 100% | 0
Enable Drug Interaction check under Order Configuration and enable Severe, Moderate and Mild drug interactions | 2 | 100% | 0
Create Formulary management for pharmacy, activate formulary and add items/medication to formulary | 2 | 100% | 0
Enter and complete/ePrescribe an order with medication to which patient has allergy and Moderate Drug Interaction | 5 | 100% | 0
Remove Moderate and Minor drug interaction check from configuration | 2 | 100% | 0
Enter and complete/ePrescribe an order with medication which can cause Minor Drug Interaction | 5 | 100% | 0

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system, based on performance with these tasks, at 85.6. (This represents the mean score of the 5 testers and is based on their answers to the System Usability Scale questionnaire. There are a total of 10 questions, each on a scale from 1 to 5, for a maximum of 50 points; both the maximum total and each individual's total were multiplied by 2 to give a result out of 100.) Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 are considered to be above average. In addition to the performance data, the following qualitative observations were made:

Major findings:

All participants were able to complete tasks easily. All participants were satisfied

with the usability of the system.

Areas of improvement:

One participant mentioned that they have to sign the order form multiple times.


INTRODUCTION

The EHRUT tested for this study was PointClickCare v3.7.10.2. Designed to present

medical information to healthcare providers in Long-Term Post-Acute Care

Facilities, PointClickCare deploys a single integrated EHR platform to all of our

2,000+ customers, through a multi-tenancy Software-as-a-Service (SaaS)

technology model. The PointClickCare integrated EHR platform is an extremely

flexible and scalable platform designed to meet the dynamic demands of Long-

Term Post-Acute Care. By connecting clinical, billing, and administration processes

across a single, web-based platform, information is only entered once, duplicate

documentation is avoided, staff have more time to spend with residents, errors

are reduced, and RUGs are optimized.

The PointClickCare application is a Java-based web application. The overall

technology stack comprises a presentation layer, a business layer, and a data

layer. Specific technologies used are: Java JDK, Spring Framework, Microsoft SQL

Server and Caucho Resin application server.
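As a rough illustration of the layered arrangement described above, the following sketch shows how a presentation layer, a business layer, and a data layer are commonly separated in a Spring-based Java application backed by a SQL database. It is a minimal, hypothetical example: the class names, endpoint, table, and query are invented for illustration and are not taken from the PointClickCare codebase.

// Hypothetical three-layer sketch for a Spring application; all names are illustrative.
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import java.util.List;

@RestController
class OrderController {                      // presentation layer: handles HTTP requests
    private final OrderService service;

    OrderController(OrderService service) {
        this.service = service;
    }

    @GetMapping("/residents/{id}/orders")
    List<String> activeOrders(@PathVariable long id) {
        return service.activeOrders(id);
    }
}

@Service
class OrderService {                         // business layer: application rules live here
    private final OrderRepository repository;

    OrderService(OrderRepository repository) {
        this.repository = repository;
    }

    List<String> activeOrders(long residentId) {
        return repository.findActiveOrderDescriptions(residentId);
    }
}

@Repository
class OrderRepository {                      // data layer: SQL access through JDBC
    private final JdbcTemplate jdbc;

    OrderRepository(JdbcTemplate jdbc) {
        this.jdbc = jdbc;
    }

    List<String> findActiveOrderDescriptions(long residentId) {
        return jdbc.queryForList(
                "SELECT description FROM orders WHERE resident_id = ? AND active = 1",
                String.class, residentId);
    }
}

In this kind of arrangement the presentation layer never touches the database directly; requests flow from controller to service to repository, which keeps business rules independent of both the web framework and the database schema.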

METHOD

PARTICIPANTS

A total of 2 administrative participants (for configuration) and 5 clinical

participants were tested on the EHRUT. Participants in the test were clinical and

administrative staff from Long-Term Post-Acute Care facilities. Participants were

recruited by PointClickCare and were not compensated for their time. In addition,

participants had no direct connection to the development of, or organization

producing, the EHRUT. Participants were not from the testing or supplier

organization and were actual users who had the same orientation and level of

training as all users have received.

For the test purposes, end-user characteristics were identified and translated into

a recruitment screener used to solicit potential participants; an example of a

screener is provided in Appendix 1.

Recruited participants had a mix of backgrounds and demographic characteristics

conforming to the recruitment screener. The following is a table of participants by

characteristics, including demographics, professional experience, computing

experience and user’s need for assistive technology. Participant names were

replaced with participant IDs so that an individual’s data could not be tied back to

an individual identity.


Part ID | Gender | Age | Education | Occupation | Professional Experience | Computer Experience | Product Experience (in years) | Assistive Technology Needs
1 | Female | 60-74 | RN/BSN | RN | 20+ | EHR, News | 11 | None
2 | Female | 40-59 | College Grad | Admin Staff | 20+ | EHR, Office Suite | 20+ | None
3 | Female | 40-59 | MBA | RN | 20+ | EHR, Banking | 10+ | None
4 | Female | 40-59 | RN/BSN | Admin | 20+ | News, Programming | 10+ | None
5 | Male | 40-59 | MBA | Admin Staff | 20+ | EHR, Programming | 2 | None

None of the participants failed to engage in the study.

Participants were scheduled for one-hour sessions.

STUDY DESIGN

Overall, the objective of this test was to uncover areas where the application

performed well – that is, effectively, efficiently, and with satisfaction – and areas

where the application failed to meet the needs of the participants. The data from

this test may serve as a baseline for future tests with an updated version of the

same EHR and/or comparison with other EHRs, provided the same tasks are

performed. In short, this testing serves both as a means to record or benchmark current usability and as a means to identify areas where improvements must be made.

During the usability test, participants interacted with one EHR. Each participant

used the system in their individual location, and was provided with the same

instructions. The system was evaluated for effectiveness, efficiency and

satisfaction as defined by measures collected and analyzed for each participant:

Path deviations

Participant’s verbalizations (comments)

Participant’s satisfaction ratings of the system

Additional information about the various measures can be found in the section on

Usability Metrics.

TASKS

A number of tasks were constructed that would be realistic and representative of

the kinds of activities a user might perform with this EHR, including:

1. Enable Drug Allergy check under Order Configuration (Drug Allergy)

2. Enable Drug Interaction check under Order Configuration and enable Severe, Moderate and Mild drug interactions (Drug Interaction)

3. Create Formulary management for pharmacy, activate formulary and add items/medication to formulary (Formulary Management)

4. Enter and complete/ePrescribe an order with medication to which patient has allergy and Moderate Drug Interaction (Drug Allergy, Drug interaction, Electronic prescription)

5. Remove Moderate and Mild drug interaction check from configuration (Drug Interaction)

6. Enter and complete/ePrescribe an order with medication which can cause a Minor Drug Interaction (Drug interaction, Electronic prescription)

Tasks were selected based on their frequency of use, their criticality of function, and their potential to be troublesome for users.

PROCEDURES

As users were located across the country, the test was performed through an

online meeting tool where the users and the test administrators could both talk

and see each other's computer screens.

To ensure that the test ran smoothly, a total of 5 participants (with one administrator per participant) and 1 moderator joined the sessions. The usability testing staff

members who conducted the test were experienced clinical application product

and program managers.

The administrator moderated the session, including administering instructions and

tasks. The administrator also monitored task times, obtained post-task rating

data, and took notes on participant comments. A second person served as the data

logger and took notes on task success, path deviations and comments.

Participants were instructed to perform the tasks (see specific instructions below):

As quickly as possible making as few errors and deviations as possible,

Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use,

Without using a “think aloud” technique.

For each task, the participants were given verbal directions about how to complete

the task.

Following the session, the administrator gave the participant the post-test

questionnaire (i.e., the System Usability Scale, see Appendix 4)

Participants' demographic information, task success rate, deviations, verbal

responses, and post-test questionnaire responses were recorded into a

spreadsheet.

Participants were thanked for their time.


TEST LOCATION

Each study participant performed the testing at their employment site in a quiet

location. Communication with the participants was through an online meeting

(WebEx) so the participant and the test administrator were able to communicate

and view each other's screens.

TEST ENVIRONMENT

The EHR would typically be used in a healthcare office or facility. In this instance, testing of the EHRUT was conducted at each tester’s place of employment. For

testing, the computer used was a desktop computer running a Windows operating

system. The participants used a mouse and keyboard when interacting with the

EHRUT. The PointClickCare application used the display that was standard at the

tester’s home office / facility. The application was set up by the tester’s employer

according to PointClickCare’s documentation describing the system set-up and

preparation. The application itself was running on a Windows platform using a

testing/training database. Technically, the system performance (i.e., response

time) was representative of what actual users would experience in a field

implementation. Additionally, participants were instructed not to alter any of the

default system settings (such as control of font size).

TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used, including:

1. Moderator’s Guide

2. Post-Test Questionnaire

Examples of these documents can be found in Appendices 3-4 respectively. The

Moderator’s Guide was devised to capture the required data.

PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each participant (also

see the full Moderator’s Guide in Appendix 3):

Thank you for participating in this study. Your input is very important. Our session

today will last about 60 minutes. During this time, you will use an instance of an

electronic health record.

I will ask you to complete a few tasks using this system and answer some questions.

You should complete the tasks as quickly as possible making as few errors as

possible. Please try to complete the tasks on your own following the instructions


very closely. Please note that we are not testing you, we are testing the system;

therefore, if you have difficulty, all this means is that something needs to be

improved in the system. I will be here in case you need specific help, but I am not

able to instruct you or provide help in how to use the application.

Overall, we are interested in how easy (or how difficult) this system is to use, what

in it would be useful to you, and how we could improve it. I did not have any

involvement in its creation, so please be honest with your opinions. All of the

information that you provide will be kept confidential and your name will not be

associated with your comments at any time. Should you feel it necessary, you are

able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR and, as their first task, were given time to explore the system and make comments; however, as they were all familiar with the system, this was declined. Once this task

was complete, the administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that

point, please perform the task and say “Done” once you believe you have

successfully completed the task. I would like to request that you not talk

aloud or verbalize until you are done

Participants were then given 7 tasks to complete. Tasks are listed in the Moderator’s guide in Appendix 3.

USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving the

Usability of Electronic Health Records, EHRs should support a process

that provides a high level of usability for all users. The goal is for users to

interact with the system effectively, efficiently, and with an acceptable

level of satisfaction. To this end, metrics for effectiveness, efficiency and

user satisfaction were captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of PointClickCare by measuring participant

success rates and errors

2. Efficiency of PointClickCare by measuring the average task time and path deviations

3. Satisfaction with PointClickCare by measuring ease of use ratings

DATA SCORING


The following table details how tasks were scored and how errors were

evaluated:

Measures | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis.

The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results were provided as a percentage.

Effectiveness: Task Failures

If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors.

The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This was expressed as the mean number of failed tasks per participant.

On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occurred if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path was divided by the number of optimal steps to provide a ratio of path deviation.

Satisfaction: Task Rating

Each participant’s subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). These data were averaged across participants.

Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants’ confidence in, and likeability of, PointClickCare overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Answer options included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.” See full System Usability Score questionnaire in Appendix 4.
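As a small illustration of the arithmetic described in the table above, the sketch below computes a task success percentage (successes divided by attempts) and a path deviation ratio (observed steps divided by optimal steps). It is a minimal sketch in the Java stack already mentioned for the product; the figures are illustrative only and are not measurements from this test.

// Illustrative computation of the effectiveness and efficiency measures described above.
public class UsabilityScoring {

    // Effectiveness: number of successes divided by number of attempts, as a percentage.
    static double successRate(int successes, int attempts) {
        return 100.0 * successes / attempts;
    }

    // Efficiency: observed steps divided by optimal steps; 1.0 means no deviation
    // from the optimal path, and values above 1.0 indicate extra steps were taken.
    static double pathDeviationRatio(int observedSteps, int optimalSteps) {
        return (double) observedSteps / optimalSteps;
    }

    public static void main(String[] args) {
        // Example values only: 2 successes in 2 attempts, 6 observed vs 6 optimal steps.
        System.out.printf("Task success: %.0f%%%n", successRate(2, 2));
        System.out.printf("Path deviation ratio: %.2f%n", pathDeviationRatio(6, 6));
    }
}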


RESULTS

DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the

methods specified in the Usability Metrics section above. Participants

who failed to follow session and task instructions had their data

excluded from the analyses. There were no deletions.

The usability testing results for the EHRUT are detailed below.

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system, based on performance with these tasks, to be 85.6. (This represents the mean score of the 5 testers. The calculation is based on the testers’ answers to the System Usability Scale questionnaire. There are a total of 10 questions, each rated on a scale from 1 to 5 [maximum total of 50 points; 5 points per question]. Both the maximum total points and the points provided by each individual were multiplied by 2 to express the result out of 100.) Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 are considered to be above average.
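The scoring rule described above can be sketched as follows: each tester’s ten 1-to-5 ratings are summed (maximum 50) and doubled to give a 0-to-100 score, and the reported value is the mean across testers. Note that this simplified rule, taken from the description in this report, differs from Brooke’s original SUS computation, which adjusts the polarity of alternating items before scaling. The answer values below are hypothetical placeholders, not the testers’ actual responses.

// Sketch of the SUS-style scoring described in this report; the answer data is hypothetical.
public class SusScore {

    // Sum the ten 1-5 answers (max 50) and multiply by 2 to scale to 0-100.
    static int score(int[] answers) {
        int total = 0;
        for (int a : answers) {
            total += a;
        }
        return total * 2;
    }

    public static void main(String[] args) {
        int[][] placeholderAnswers = {
            {5, 4, 5, 4, 5, 4, 5, 5, 5, 4},
            {4, 4, 4, 4, 4, 4, 4, 4, 4, 4},
            {5, 5, 5, 4, 5, 4, 5, 4, 5, 4},
            {4, 5, 4, 5, 4, 5, 4, 5, 4, 5},
            {5, 5, 4, 4, 5, 5, 4, 4, 5, 5}
        };
        double sum = 0;
        for (int[] tester : placeholderAnswers) {
            sum += score(tester);
        }
        System.out.printf("Mean score across %d testers: %.1f%n",
                placeholderAnswers.length, sum / placeholderAnswers.length);
    }
}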

Task | N | Task Success | Task Deviation | Errors
Enable Drug Allergy check under Order Configuration | 2 | 100% | 0% | 0
Enable Drug Interaction check under Order Configuration and enable Severe, Moderate and Mild drug interactions | 2 | 100% | 0% | 0
Create Formulary management for pharmacy, activate formulary and add items/medication to formulary | 2 | 100% | 0% | 0
Enter and complete/ePrescribe an order with medication to which patient has allergy and Moderate Drug Interaction | 5 | 100% | 0% | 0
Remove Moderate and Minor drug interaction check from configuration | 2 | 100% | 0% | 0
Enter and complete/ePrescribe an order with medication which can cause Minor Drug Interaction | 5 | 100% | 0% | 0


DISCUSSION OF THE FINDINGS

EFFECTIVENESS

It appears that the application is effective for these testers. It might merit further testing in the future with newer users or those not accustomed to the system, to determine how intuitive the user interface is.

EFFICIENCY

All testers performed each of the tasks rapidly and without errors. Therefore, it can be assumed that users who are acclimated to the system are able to perform tasks with high efficiency.

SATISFACTION

Overall, the testers are satisfied with the application and its functionality. They are users who are in the application frequently to manage the care of their residents. Two of the users, although not clinical, are responsible for supporting their clinical teams and function as system administrators; therefore, they are not only knowledgeable about the application but also about how many users work in it.

MAJOR FINDINGS

It is very easy to enable the drug allergy check, enable the drug interaction check, and set up a formulary in the system.

AREAS FOR IMPROVEMENT

Orders need to be signed multiple times.

This will be evaluated for a future release.


APPENDICES

The following appendices include supplemental data for this usability

test report. Following is a list of the appendices provided:

1: Sample Recruiting Screener

2: Participant Demographics

3: Example Moderator’s Guide

4: System Usability Scale Questionnaire


Appendix 1: SAMPLE RECRUITING SCREENER

1. Recruiting

We are recruiting individuals to participate in a usability study for an electronic health record. We would like to ask you a few questions to gather participation information. This should only take a few minutes of your time. This is strictly for research purposes. You will not be paid to participate in Training and Testing.

Please answer the following questions:

1. Are you male or female? Male / Female

2. Have you participated in a focus group or usability test in the past 6 months? Yes / No

3. Do you, or does anyone in your home, work in marketing research, usability research, web design, or other computer work? Yes / No

4. Do you, or does anyone in your home, have a commercial or research interest in an electronic health record software or consulting company? Yes / No

5. Which of the following best describes your age? 23 to 39 / 40 to 59 / 60 to 74 / 75 or older

6. Which of the following best describes your race or ethnic group? Caucasian, Asian, Black/African American, Latino/a or Hispanic, Other

7. Do you require any assistive technologies to use a computer? Yes / No. If Yes, please describe:

Professional Demographics:

8. What is your current position and title? (Must be healthcare related)

_____RN: Specialty

_____Physician: Specialty

_____Resident: Specialty

_____Administrative Staff

_____Other Medical: Specialty

9. How long have you held this position?


10. Describe your work location (or affiliation) and environment [e.g., private practice, health system, government clinic, etc.]

11. Which of the following describes your highest level of education? High School/GED, Some College, College Graduate (RN/BSN), Postgraduate (MD/PhD), Other: Please Describe:

2. Recruiting

Computer Expertise:

12. Besides reading email, what professional activities do you do on the computer? Access EHR, Research, News, Shopping/Banking, Digital Pictures, Programming, Microsoft Office Products

13. About how many hours per week do you spend on the computer? 0 to 10 / 11 to 25 / 26 or More

14. What computer platform do you usually use? [e.g., Mac, Windows, etc.]

15. What Internet browser(s) do you usually use? [e.g., Firefox, IE, AOL, etc.]

16. In the last month, how often have you used an electronic health record?

17. How many years have you used electronic health records?

18. How many EHRs do you use or are you familiar with?

19. How does your work environment record/retrieve patient records?

On Paper, Some Paper/Some Electronic, All Electronic

Contact Information:

Name of participant:

Address:

City: State: Zip:

Daytime phone: Evening Phone: Cell phone:


Email address:

Appendix 2: PARTICIPANT DEMOGRAPHICS


Following is a high-level overview of the participants in this study.

Gender

Men 1

Women 4

Total (participants) 5

Occupation/Role

RN/LVN 2

Physician 0

Administrative Staff 3

Total (participants) 5

Years of Experience Combined

Years of experience with Health Records 50+


Appendix 3: MODERATOR’S GUIDE

Seven tasks are presented here for illustration.

EHRUT Usability Test Moderator’s Guide

Administrator: One administrator per participant. Meetali Acharya, Sumona Tailor, Ryan D’Mello, Brad Payne, Sumayya Javid

Data Logger: Meetali Acharya, Sumona Tailor, Ryan D’Mello, Brad Payne, Sumayya Javid

Date: 9/16/2016 Time: 11: AM EST

Participant # 3

Location: Remote

Prior to testing

☐ Confirm schedule with participant

☐ Ensure participant’s environment is running properly

Prior to each participant:

☐ Reset application

☐ Start session recordings with tool

Prior to each task:

☐ Reset application to starting point for next task

Orientation (5 minutes)

Thank you for participating in this study. Our session today will last 60 minutes.

During that time, you will take a look at an electronic health record system.

I will ask you to complete a few tasks using this system and answer some questions. We

are interested in how easy (or how difficult) this system is to use, what in it would be

useful to you, and how we could improve it. You will be asked to complete these tasks

on your own, trying to do them as quickly as possible, with the fewest possible errors or

deviations. Do not do anything more than asked. If you get lost or have difficulty, I


cannot help you with anything to do with the system itself. Please save your detailed

comments until the end of a task or the end of the session as a whole when we can

discuss freely. I did not have any involvement in its creation, so please be honest with

your opinions.

The product you will be using today is the production version of the PointClickCare application.

Do you have any questions or concerns?

Preliminary Questions (5 minutes)

What is your job title / appointment?

How long have you been working in this role?

What are some of your main responsibilities?

Tell me about your experience with electronic health records.


Task 1: First Impressions (120 Seconds)

This is the application you will be working with. Have you heard of it? If so, tell me what you know about it.

☒ Yes ☐ No

Show test participant the EHRUT.

Please don’t click on anything just yet. What do you notice? What are you able to do here? Please be specific.

Administrator / Note Taker Comments: All five of the testers were familiar with the application and had either trained on it at an earlier time or used it on a daily basis.


Task 2: Enable Drug Allergy (180 seconds)

Drug allergy checking is not enabled by default in the application. An administrative user has access to the configuration from which drug allergy checking can be enabled. The user can follow Clinical>>setup>>Order Configuration to enable this feature.

Success: ☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments:

Task Time: ____120____ Seconds

Observed Errors and Verbalizations:

Comments: No Errors

Rating: Overall, this task was: ___1___

Show participant written scale: “Very Easy” (1) to “Very Difficult” (5)

Administrator / Note Taker Comments:

Testers said this task was easy and something with which they were familiar.

Saving data took more time than expected; participants indicated that this was due to their slow network connections.

They felt that it was easy for them because they had been trained in the system.


Task 3: Enable Drug Interaction check under Order Configuration and enable Severe, Moderate and Mild drug interactions (180 seconds)

Drug interaction checking is not enabled by default in the application. An administrative user has access to the configuration from which drug interaction checking can be enabled. The user can follow Clinical>>setup>>Order Configuration to enable this feature.

Success: ☐ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments:

Task Time: _____120___ Seconds Observed Errors and Verbalizations:

Comments: No errors

Rating: Overall, this task was: __1____ Show participant written scale: “Very Easy” (1) to “Very Difficult” (5)

Administrator / Note Taker Comments:

Testers said this task was easy and something with which they were familiar.

Saving data took more time than expected; participants indicated that this was due to their slow network connections.

They felt that it was easy for them because they had been trained in the system.


Task 4: Create Formulary management for pharmacy, activate formulary and add items/medication to formulary (240 seconds)

The user can set up a formulary by navigating to Clinical>>Setup>>Formulary management and can create a new formulary of the Pharmacy type. Once the formulary is created, the user can activate it and add items/medication to it.

Success:

☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments:

Task Time: _____140___ Seconds Observed Errors and Verbalizations:

Comments: No errors

Rating: Overall, this task was: __1____ Show participant written scale: “Very Easy” (1) to “Very Difficult” (5)

Administrator / Note Taker Comments:

Testers said this task was easy and something with which they were familiar.

They felt that it was easy for them because they had been trained in the system.


Task 5: Enter and complete an order with medication to which patient has allergy and Moderate Drug Interaction (180 seconds)

The user can create an order by navigating to patient profile>>orders and clicking the New button. The user can search the medication library or select the formulary. The user can add a medication to which the patient has an allergy and a moderate drug interaction.

Success: ☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments:

Task Time: _____120___ Seconds Observed Errors and Verbalizations:

Comments: No errors

Rating: Overall, this task was: __2____ Show participant written scale: “Very Easy” (1) to “Very Difficult” (5)

Administrator / Note Taker Comments:

Testers said this task was easy and something with which they were familiar.

They felt that it was easy for them because they had been trained in the system.

Testers said that they have to sign the order multiple times.


Task 6: Remove Moderate and Minor drug interaction check from configuration (180 seconds)

The user can follow Clinical>>setup>>Order Configuration to remove the Moderate and Mild drug interaction checks.

Success: ☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments:

Task Time: _____120___ Seconds Observed Errors and Verbalizations:

Comments: No errors

Rating: Overall, this task was: __1____ Show participant written scale: “Very Easy” (1) to “Very Difficult” (5)

Administrator / Note Taker Comments:

Testers said this task was easy and something with which they were familiar.

Saving data took more time than expected; participants indicated that this was due to their slow network connections.

They felt that it was easy for them because they had been trained in the system.


Task 7: Enter and complete an order with medication which can cause a Minor Drug Interaction (180 seconds)

The user can create an order by navigating to patient profile>>orders and clicking the New button. The user can add a medication which can cause a minor interaction.

Success: ☒ Easily completed

☐ Completed with difficulty or help: Describe below

☐ Not completed Comments:

Task Time: _____120___ Seconds Observed Errors and Verbalizations:

Comments: No errors

Rating: Overall, this task was: __2____ Show participant written scale: “Very Easy” (1) to “Very Difficult” (5)

Administrator / Note Taker Comments:

Testers said this task was easy and something with which they were familiar.

They felt that it was easy for them because they had been trained in the system.

Testers said they have to sign the order multiple times.


Final Questions (10 Minutes)

What was your overall impression of this system? Easy to use

What aspects of the system did you like most?

What aspects of the system did you like least?

- Treating warnings, alerts, etc. as discrete events that require individual attention makes completing orders more complicated than expected / necessary

- I would expect a single signature to suffice for all reviewed items

Were there any features that you were surprised to see? If yes, do these features impact the application’s usability in a positive way?

None

What features did you expect to encounter but did not see? That is, is there anything that is missing in this application? No

Compare this system to other systems you have used. Would you recommend this system to your colleagues?

No comparisons. Yes, definitely recommend


Appendix 4: SYSTEM USABILITY SCALE QUESTIONNAIRE

In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems usability” known as the System Usability Scale or SUS.

Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).

Tester 1

Each statement was rated on a scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system


Tester 2

Each statement was rated on a scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system


Tester 3

Each statement was rated on a scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system


Tester 4

Each statement was rated on a scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system


Tester 5

Each statement was rated on a scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system
