
1

An Overview of Usability Evaluation

#15

2

Outline

What is usability evaluation?

Why perform usability evaluation?

Types of usability evaluations

What can we learn from heuristic evaluation? From usability testing?

How do we conduct a heuristic evaluation? A usability test?

3

What Is Usability Evaluation?

A systematic process aimed at assessing the fit between a UI design and human capabilities within a task context

It is a central element of UI design, performed throughout the development process

4

Why Perform Usability Evaluations?

Find usability problems in an interface design

Assess compliance with a style guide (e.g., MS Windows)

Compare alternative UI components (e.g., icon designs, input/output technologies)

Assess the worth/usefulness of the software in overall job context

5

Evaluation Methods

Evaluation categories, their requirements, and their techniques:

Usability inspection methods
Requirements: a static prototype; a UI design expert
Techniques: heuristic evaluation, evaluation against guidelines, cognitive walkthrough

User-based evaluations
Requirements: a dynamic prototype; a usability analyst
Techniques: questionnaires, observational usability studies, formal usability studies with quantitative data analysis, controlled experiments

Analytic evaluations
Requirements: a UI designer with expertise in analytic techniques
Techniques: keystroke-level model, GOMS, grammars
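The analytic techniques in the last category predict expert performance without involving users. As a rough illustration, here is a minimal keystroke-level model (KLM) sketch in Python; the operator times are commonly cited textbook estimates and the task sequence is a hypothetical example, neither taken from this lecture.

# Minimal keystroke-level model (KLM) sketch.
# Operator times (seconds) are commonly cited estimates; the task
# sequence below is a hypothetical example, not from the lecture.
OPERATOR_TIMES = {
    "K": 0.2,   # press a key (skilled typist)
    "P": 1.1,   # point with the mouse
    "B": 0.1,   # press or release a mouse button
    "H": 0.4,   # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_time(operators):
    """Sum operator times to predict expert, error-free task time."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Hypothetical task: move hand to mouse, think, point at a menu and
# click it, think, point at a menu item and click it.
sequence = ["H", "M", "P", "B", "B", "M", "P", "B", "B"]
print(f"Predicted task time: {klm_time(sequence):.2f} s")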

6

Questions of a Usability Evaluation (1)

Are the functions made available in a convenient, task-oriented way? We need task knowledge

Does the system anticipate the skill and knowledge of the user? We need user knowledge

Does the design meet general rules of good user interface design? We need UI knowledge

7

Questions of a Usability Evaluation (2)

Is the system compliant with other applications running in the user's environment? We need style guide knowledge

How fast can users learn to use the system?

At what speed can users perform various tasks?

How likely are users to complete a given task?

8

Usability Inspection Methods

Evaluation against guidelines

Heuristic evaluation

9

Evaluation Against Guidelines and Rules

A process in which each UI element (e.g., menu choice, icon, button, pointer, radio button) is examined against an existing set of general guidelines and a specific set of design rules (the style guide) applicable to a specific product, e.g., MIL-STD-1476F or the Windows style guide

Performed by one or more UI design experts who have a thorough familiarity with general UI design guidelines and the product/corporate style guide

10

Guidelines and Rules

Guidelines are accepted principles for interface design

Rules specify the interface appearance or action

11

Examples of Guidelines

Displays should be consistently formatted

Displays should be uniquely identified

Use short, simple sentences

Employ units of measurement that are familiar to the user

12

Examples of Design Rules

The character stroke width of a system font shall be at least 2 pixels thick

F10 (and Shift+Menu) exits the menu bar and returns the location cursor to the previous object with focus
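Rules like these are precise enough to be written down as concrete, testable checks, which is what distinguishes them from the broader guidelines above. Purely as an illustration of that precision, here is a hypothetical check in Python; the property name font_stroke_width_px is invented for this sketch and is not part of any real style guide.

# Illustrative only: a hypothetical design rule expressed as a testable check.
# The property name "font_stroke_width_px" is invented for this sketch.
def check_min_stroke_width(ui_properties: dict, minimum_px: int = 2) -> bool:
    """Return True when the system font's character stroke width meets the rule."""
    return ui_properties.get("font_stroke_width_px", 0) >= minimum_px

print(check_min_stroke_width({"font_stroke_width_px": 2}))  # True: rule satisfied
print(check_min_stroke_width({"font_stroke_width_px": 1}))  # False: rule violated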

13

Pros and Cons of Evaluating Against Guidelines

Pros: Provides information on basic design issues. Finds a broad range of usability problems.

Cons: Does not assess whether the system meets user and task needs (it can be compliant and still have a poor design). Time consuming. Guidelines/rules don't exist for all areas of UI design.

14

Heuristic Evaluation

Popular and widely used structured review of a UI

Objective is to generate a list of potential usability problems

Evaluator assumes the user's role and identifies problems from a user's perspective

The criteria for identifying "a problem" are a set of recognized usability principles called "heuristics"

15

Heuristics Identified by Nielsen (1993)

Use simple and natural dialogue

Speak the users' language

Minimize the users' memory load

Be consistent

Provide feedback

Provide clearly marked exits

Provide shortcuts

Provide good error messages

Prevent errors

16

Conducting a Heuristic Evaluation

Collect background information: identify typical users, scenarios, previous feedback, and usability goals

Inspect the flow of the interaction from screen to screen

Inspect screens one at a time against the heuristics

Generate an inspection problem report: list and prioritize the usability issues, fixes, and/or redesigns (a minimal record sketch follows below)
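As a minimal sketch of what such a problem report might record, the Python snippet below defines a hypothetical record and orders the issues by severity; the field names and the 0-4 severity scale are assumptions for illustration, not part of the lecture.

# Minimal sketch of an inspection problem report record.
# Field names and the 0-4 severity scale are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    screen: str          # where the problem was observed
    heuristic: str       # which heuristic it violates
    description: str     # what the evaluator saw
    severity: int        # 0 = not a problem ... 4 = usability catastrophe
    proposed_fix: str = ""

def prioritized_report(problems):
    """Return problems ordered from most to least severe."""
    return sorted(problems, key=lambda p: p.severity, reverse=True)

problems = [
    UsabilityProblem("Checkout", "Provide feedback", "No confirmation after submitting", 3),
    UsabilityProblem("Search", "Speak the users' language", "Jargon in field labels", 2),
]
for p in prioritized_report(problems):
    print(f"[severity {p.severity}] {p.screen}: {p.description}")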

17

Who Should Inspect?

Usability specialists often find more “problems” than evaluators with no usability experience (or computer experience only)

Usability specialists with knowledge about a particular kind of interface being developed (Double specialists) find more usability problems than “regular” usability specialists

How Many Inspectors?

A single evaluator finds only about 35% of the problems

Increasing the number of evaluators from 2 to 5 raises the proportion of problems found to around 75% of all the problems

Figure: percentage of problems found vs. number of inspectors (Nielsen, 1993)
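The shape of this curve is commonly modeled by assuming each evaluator independently finds a fixed proportion p of the problems, so i evaluators find about 1 - (1 - p)^i of them. The sketch below simply evaluates that model; p is an assumed parameter, and the exact percentages it produces are not figures from the slide.

# Sketch of the cumulative-detection model behind the curve:
# with i independent evaluators who each find a proportion p of the
# problems, the expected fraction found is 1 - (1 - p)**i.
def fraction_found(num_evaluators: int, p: float = 0.35) -> float:
    return 1 - (1 - p) ** num_evaluators

for i in range(1, 6):
    print(f"{i} evaluator(s): about {fraction_found(i):.0%} of problems found")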

18

19

Types of Problems Uncovered by Heuristic Evaluation

Missing or difficult-to-access functionality

Limited or inappropriate task flow

Limited navigational cues

Inappropriate feedback

Cluttered screens

20

Pros and Cons of Heuristic Evaluation

Pros: Does not involve users. Relatively inexpensive. Finds a broad range of major and minor usability problems. Effectiveness is maximized by using multiple evaluators. Less intimidating to developers than usability testing.

Cons: Subjective and dependent on the HCI skills of the evaluators and their knowledge of the task and the users. Depends on how realistically, and to what degree, the system is exercised. Not exhaustive.

21

Usability Testing

22

Usability Testing

A set of user-based evaluation methods: questionnaires, observational usability studies, formal usability studies with quantitative data analysis, and controlled experiments

Observe and measure how users interact with an application

Focus on the direct feedback from end users interacting with the system

Should be the ultimate goal of every evaluation plan because it involves real end users

23

The Nature of Usability Testing

Merges several user-based evaluation methods into a single evaluation process: observation, interviews, and testing

Each method illuminates a different aspect of usability

Performed after a design (or parts of it) has been fine-tuned based on usability inspection techniques

Performed before a prototype is handed over to developers and a product is sent out

24

When Is Usability Testing Useful?

Test early to evaluate an individual aspect of the design, significantly affect the design, and provide quick answers to developers. Early tests may involve fewer users and may collect less data.

Test late to verify the entire application (stable design, full functionality) and to assess the impact of the design on the user (controlled variables).

25

Where Is Usability Testing Performed?

User's office environment: users are in their natural surroundings, and it is easier to recruit users. But: an uncontrolled environmental setting, interruptions, a variety of computer configurations, and no observation by the development team permitted.

Usability lab: a controlled setting, consistent computer configuration, data collection equipment, and unobtrusive observation by the development team.

26

Who Participates in Usability Testing?

Evaluators: usability specialists

Participants: potential users

Observers: members of the design/development team

27

How Many Participants to Include?

At least 2 from each distinct user group

2 - 3 at earlier stages of the evaluation when focus is on gross usability issues

6 and up (per user group) at later stages of the evaluation when focus is on performance assessment

Remember, the objective in usability evaluation is not to uncover statistical differences, only design issues

28

Measures of Usability

Time to complete task

Completion rate

Number of errors

Types of errors

Severity of errors

Number of requests for help

Number of trials to become proficient in using the system

Comparative ratings

Subjective ratings
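Several of these measures can be computed directly from logged test sessions. The sketch below shows one hypothetical way to do so; the session record fields and the sample data are assumptions made for illustration.

# Hypothetical session records; field names and values are assumptions.
sessions = [
    {"participant": "P1", "completed": True,  "seconds": 312, "errors": 2, "help_requests": 1},
    {"participant": "P2", "completed": False, "seconds": 480, "errors": 5, "help_requests": 3},
    {"participant": "P3", "completed": True,  "seconds": 275, "errors": 1, "help_requests": 0},
]

completion_rate = sum(s["completed"] for s in sessions) / len(sessions)
completed = [s for s in sessions if s["completed"]]
mean_time = sum(s["seconds"] for s in completed) / len(completed)
mean_errors = sum(s["errors"] for s in sessions) / len(sessions)

print(f"Completion rate: {completion_rate:.0%}")
print(f"Mean time on task (completed runs only): {mean_time:.0f} s")
print(f"Mean errors per session: {mean_errors:.1f}")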

29

What to Expect From Test Participants

Do the unexpected

Have preconceived ideas

Do not always ask for help

Fail to follow instructions

Quickly develop habits

Are afraid of breaking the system

Are apologetic

30

Data Collection Techniques

Videotaping: the user's interactions with the application; the user's facial expressions

Audiotaping: user comments; observer comments

Data collection applications: keystroke capture; indexed videotape

Questionnaires

Interviews: open-ended or structured
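As one hypothetical illustration of keystroke capture used to index a session recording, the snippet below timestamps events relative to the start of the session; the event names and the CSV format are assumptions, not a specification from the lecture.

import csv
import time

# Hypothetical keystroke/event capture for indexing a session recording.
class SessionLogger:
    def __init__(self):
        self.start = time.monotonic()
        self.rows = []  # (elapsed seconds, event name)

    def log(self, event: str) -> None:
        """Record an event with the time elapsed since the session started."""
        self.rows.append((round(time.monotonic() - self.start, 2), event))

    def save(self, path: str = "session_log.csv") -> None:
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["elapsed_s", "event"])
            writer.writerows(self.rows)

logger = SessionLogger()
logger.log("task_started")
logger.log("key:F10")        # e.g., individual keystrokes captured during the task
logger.log("help_requested")
logger.save()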

31

Summary

Start evaluation early in the design process and continue to evaluate throughout the development cycle. This minimizes the likelihood of a major usability problem emerging during the later phases of development.

Incorporate a variety of evaluation methods. One method cannot predict or identify all the potential usability issues.

Include at least one user-based evaluation method in your evaluation plan