Transcript of Heuristic Evaluation, HCC 729, 2/13/14

Page 1

Heuristic Evaluation

HCC 729, 2/13/14

Page 2

We’ll follow up next time

• Inspirations, reading feedback
• Your HTAs and personas

Page 3

How to conduct a Heuristic Evaluation

Read this: http://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/

Adapted from slides by Karen Tang and Ryan Baker

Page 4

What is an evaluation?

• Gather data about the usability of a product or design by a particular group of users for a particular activity or task within a particular environment or context

• Evaluation goals:
  • Assess extent of system’s functionality
  • Assess effect of interface on user
  • Identify specific problems with system

Page 5

HE vs. user testing

• When we can, we want to test with real users

• HE is a “discount” usability technique

• When it’s useful:
  – When real users are unavailable
  – Very early in the design
  – As a sanity check (but not a replacement for user testing)

Page 6

Why HE is great

• Cheap
  • Doesn’t “spend” users
• Fast
  • 1–2 days (instead of 1 week)
• Good
  • Proven effective: the more careful you are, the better it gets
• Easy to use
  • Relatively easy to learn, can be taught

Page 7

Heuristic Evaluation

• A type of discount usability testing
• A rational method – an expert applies “heuristics”
  • Mentally apply a theory or rule to the design and see whether that theory/rule’s advice is being followed

• Key Idea: Multiple expert evaluators independently apply a set of heuristics to an interface, produce Usability Action Reports (UARs), combine & prioritize their findings

Page 8

What can you evaluate with a HE?

• Any interface that has been “developed”
  • A pre-existing webpage
  • A sketch of a future interface (it can be fully implemented or exist only as a sketch)

• This method can be applied to your own interface, or a competitor’s
  • You will evaluate the interface according to a standard set of 10 heuristics

Page 9

How many evaluators are needed?

• Nielsen recommends at least 3, but go for 5!
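The “go for 5” advice traces to Nielsen and Landauer’s cost–benefit model, in which the proportion of problems found by i independent evaluators is 1 − (1 − λ)^i, where λ is the chance that a single evaluator finds any given problem (about 0.31 on average across the projects Nielsen studied; your project’s λ will differ). A minimal sketch of the diminishing returns, assuming that average λ:

```python
# Nielsen & Landauer's model of problems found by i independent evaluators.
# LAM is an assumed average hit rate; it varies by interface and evaluator skill.
LAM = 0.31

def proportion_found(i: int, lam: float = LAM) -> float:
    """Expected fraction of all usability problems found by i evaluators."""
    return 1 - (1 - lam) ** i

for i in (1, 3, 5, 10):
    print(f"{i:2d} evaluators -> {proportion_found(i):.0%} of problems")
```

With these assumed numbers, five evaluators find roughly 84% of the problems, which is why adding evaluators beyond five pays off less and less.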

Page 10

Who should do the HE?

• Anyone who knows the appropriate heuristics can do a HE

• But, heuristic evaluation experts find almost twice as many problems as novices

• Heuristic evaluation experts who are also domain experts find almost three times as many problems as novices

Page 11

Phases of Heuristic Evaluation

0) Pre-evaluation training (optional): give evaluators needed domain knowledge & information on the scenario
1) Evaluate the interface to find usability problems
2) Record the problems
3) Aggregate problems
4) Assign severity rating
5) Assign solution complexity rating

Page 12

#1: evaluate the interface

Page 13

Which heuristics to use?

• Many possible heuristic sets
  • Some standard sets (e.g. Nielsen’s usability heuristics)
  • You might create your own heuristics, e.g. for specific applications

• We’ll focus on Nielsen’s, which cover a range of general usability issues

Page 14

Nielsen’s 10 Heuristics

Find which heuristic is violated:
1. Simple & Natural Dialog
2. Speak User’s Language
3. Minimize User’s Memory Load
4. Consistency
5. Feedback
6. Clearly Marked Exits
7. Shortcuts
8. Good Error Messages
9. Prevent Errors
10. Help & Documentation

http://www.nngroup.com/articles/ten-usability-heuristics/

Page 16

#2: record the problem

Page 17

Record the problem

• Each evaluator writes a Usability Action Report (UAR) describing each usability problem they encounter
• HEs are typically used to report problems
  • However, UARs can be used to report both the good and bad qualities of an interface in other usability evaluations

• I have posted a template UAR on the blog with the assignment

Page 18

Sample UAR

• EVALUATOR: XXXXX
• ID NUMBER: XXX
• NAME: Descriptive name for the problem
• EVIDENCE: Describe the violation, and why you wrote this report
• EXPLANATION: Your interpretation: what heuristic was violated, and why
• Severity: Write up at the end of the evaluation
• Fixability: Write up at the end of the evaluation
• Possible Fix: Write up at the end of the evaluation
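As an illustration only (the template posted on the blog is the authoritative form), the fields above map naturally onto a small record type; the sample values below are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UAR:
    """Usability Action Report, mirroring the template fields on this slide."""
    evaluator: str    # EVALUATOR
    id_number: str    # ID NUMBER
    name: str         # NAME: descriptive name for the problem
    evidence: str     # EVIDENCE: the violation, and why you wrote this report
    explanation: str  # EXPLANATION: which heuristic was violated, and why
    # Filled in at the end of the evaluation:
    severity: Optional[int] = None    # 1 (blemish) .. 4 (critical)
    fixability: Optional[int] = None  # 1 (trivial) .. 4 (nearly impossible)
    possible_fix: Optional[str] = None

# Hypothetical example report:
report = UAR(
    evaluator="XXXXX", id_number="001",
    name="No feedback after search",
    evidence="Clicking Search gives no progress indication for several seconds.",
    explanation="Violates 'Feedback': the system's state is invisible to the user.",
)
```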

Page 19

Keep looking for problems!

• Usually takes a few hours
• A shorter time may not find important problems
• A longer time will exhaust the evaluator, and they may become less productive

• For very large interfaces, it is good to break heuristic evaluation into several sessions

Page 20

What about multiple problems?

• This happens a lot; record them separately.

• This is not busywork…
  • It may be possible to fix some of the problems, but not all of them
  • The problems might not always be linked to each other – one may show up in other situations too

Page 21

You are not done yet…

• You still need to address the bottom half of the UAR:
  • Severity
  • Solution Complexity
  • Possible Fix

• You may want to take a break before finishing these UARs…

Page 22

#3: aggregate the problems

Page 23

Aggregate Problems

• Wait until all UARs are in
  • You are aggregating across all evaluators

• Aggregating usability problems:
  • Combine problems by consensus
  • Gain a sense of relative importance after you’ve seen a few problems
  • At this point, decide which entries are and aren’t problems (but keep the original version of each report somewhere)
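A minimal sketch of the bookkeeping side of aggregation; the consensus about which reports describe the same problem still happens among the evaluators, and the problem keys and reports below are hypothetical:

```python
from collections import defaultdict

# Each entry: (evaluator, problem_key, original description).
# The shared problem_key is what the evaluators agree on by consensus
# when they decide two reports describe the same problem.
reports = [
    ("A", "no-search-feedback", "No spinner after clicking Search"),
    ("B", "no-search-feedback", "Search appears to hang; no feedback"),
    ("B", "jargon-labels", "'Term/Session' label uses registrar jargon"),
]

# Group duplicates under one canonical problem, keeping every original report.
aggregated = defaultdict(list)
for evaluator, key, description in reports:
    aggregated[key].append((evaluator, description))

for key, originals in aggregated.items():
    print(f"{key}: {len(originals)} report(s)")
```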

Page 24

#4: assign each problem a severity rating

Page 25

Assign Severity Rating to UARs

• Severity Ratings help project leads determine which problems should be given more developer time
  • Not all problems can be fixed
  • Some problems will have more severe consequences

• Each evaluator should assign severity separately

Page 26

Assign Severity Rating to UARs

Based on a combination of:
• Frequency
  • How common or rare is the problem?
• Impact
  • How easy is it to overcome the problem?
  • How disastrous might the problem be?
• Persistence
  • How repeatedly will users experience the problem?
  • Are workarounds learnable?

Page 27

Assign an Overall Severity Rating

• A single severity rating per problem helps developers allocate resources

• Therefore, evaluators need to combine their opinions of a problem’s Frequency, Impact, & Persistence ratings into one Severity evaluation
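The slides don’t prescribe a formula for this combination (it is ultimately a judgment call), but one simple illustrative scheme, assumed here, is to rate each dimension from 1 (low) to 4 (high) and take the rounded mean:

```python
# Illustrative only: combining Frequency, Impact, and Persistence into one
# Severity is really a judgment call, not a formula. Here each dimension is
# rated 1 (low) .. 4 (high) and the overall severity is the rounded mean.
def overall_severity(frequency: int, impact: int, persistence: int) -> int:
    assert all(1 <= x <= 4 for x in (frequency, impact, persistence))
    return round((frequency + impact + persistence) / 3)

# A fairly common, hard-to-avoid, recurring problem rates as major (3):
print(overall_severity(frequency=3, impact=3, persistence=3))
```

Evaluators would still discuss any problem where their individual ratings disagree widely, rather than trusting the arithmetic.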

Page 28

Nielsen’s Severity Ratings

1. Usability Blemish. Mild annoyance or cosmetic problem. Easily avoidable.
2. Minor usability problem. Annoying, misleading, unclear, confusing. Can be avoided or easily learned. May occur only once.

3. Major usability problem. Prevents users from completing tasks. Highly confusing or unclear. Difficult to avoid. Likely to occur more than once.

4. Critical usability problem. Users won’t be able to accomplish their goals, and may quit using system.

Page 29

False positives

• There’s no virtue in finding 6,233 problems if very few of them actually cause problems for a user

• Every problem reported in a heuristic evaluation takes time for the developers to consider

• Some interface aspects that seem like problems at first might not be problems at all

Page 30

#5: assign each solution a complexity rating

Page 31

5: Solution Complexity Rating

• Some problems take more time to fix than others, so it’s important to allocate developers’ time well

• Ideally, this rating would be made by a developer, or by someone who is familiar with development on the target platform

Page 32

Solution Complexity Rating

1. Trivial to fix. Textual changes and cosmetic changes. Minor code tweaking.

2. Easy to fix. Minimal redesign and straightforward code changes. Solution known and understood.

3. Difficult to fix. Redesign and re-engineering required. Significant code changes. Solution identifiable but details not fully understood.

4. Nearly impossible to fix. Requires massive re-engineering or use of new technology. Solution not known or understood at all.
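Once aggregated problems carry both a severity (1–4, higher is worse) and a solution complexity (1–4, higher is harder), a simple triage, assumed here rather than taken from the slides, is to surface the most severe problems first and, among equally severe ones, the cheapest fixes. The example problems are hypothetical:

```python
# Each problem: (name, severity 1-4 [4 = critical], complexity 1-4 [4 = hardest]).
problems = [
    ("confusing error message", 2, 1),
    ("search hangs with no feedback", 3, 2),
    ("checkout loses cart contents", 4, 3),
    ("logo slightly off-center", 1, 1),
]

# Triage: highest severity first; among equals, easiest fix first.
triaged = sorted(problems, key=lambda p: (-p[1], p[2]))
for name, sev, cpx in triaged:
    print(f"sev {sev} / fix {cpx}: {name}")
```

A project lead might weight the two axes differently; the point is only that both ratings feed the allocation of developer time.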

Page 33

Record Possible Fixes

• While evaluating solution complexity, the evaluator may have thought about how the problem could be fixed

• Record these possible fixes as suggestions to developers
  • Don’t focus on the feasibility of solutions (that is the developers’ job)
  • Your suggestions may be thought-provoking

Page 34

Phases of Heuristic Evaluation

0) Pre-evaluation training (optional): give evaluators needed domain knowledge & information on the scenario
1) Evaluate the interface to find usability problems
2) Record the problems
3) Aggregate problems
4) Severity rating
5) Solution complexity rating

Page 35

Why HE?

• They find a reasonably large set of problems
• They are one of the easiest, quickest, and cheapest methods available

Page 36

HE vs. User Testing

• User tests are more effective at revealing when a system’s manifest model or metaphor is confusing

• User tests are less effective at finding obscure problems

• User tests are also much more expensive

• Advice: use HE first, to find the obvious problems, then user test.

Page 37

For next week

• Assignment
• Readings

Page 38

In-class assignment

Perform HE on UMBC class search with PeopleSoft

Use template form from the blog

Page 40

HW: Perform HE on your sites

• Go through the 5 stages of HE for your website
  – If on a team, each member goes through the HE individually, then combine later
• Turn in 1 completed form for each incident
• Come up with at least 8 UARs for each webpage
• Aggregate, and finish filling out the template!
• Everyone writes 100–200 words describing what they learned

Page 41

HW Extra credit

• Perform a HE on another student’s web site (same as previous slide)

Page 42

Reading

• Required
  • Usability Engineering, Chapter 6 (focus on 6.1–6.6 and 6.8)
• Optional
  • Think-aloud method