HCI460: Week 2 Lecture
September 16, 2009


Conferences

Other usability inspection methods:
– Pluralistic walkthrough
– Competitive evaluation

Effectiveness of usability inspection methods

How to write good findings

How to write good recommendations

Project 1b

Outline


Conferences


Mobile HCI (Human-Computer Interaction)
– Focus on mobile systems: wearable computing, mobile phones, in-car interfaces, etc.
– Late summer, usually in Europe
– 2009 in Bonn (11th)

UPA (Usability Professionals’ Association)
– Every Summer
– Very practical (mostly attended by practitioners), more qualitative
– 2009 in Portland (13th); 2010 in Munich

HFES (Human Factors and Ergonomics Society)
– Every Fall
– Quantitative, much more academic than UPA
– 2009 in San Antonio (53rd)

Mobile HCI, UPA, and HFES (Conferences)


HCII (Human-Computer Interaction International)
– Every other Summer
– Very large conference; many different topics
– 2007 in Beijing; 2009 in San Diego (13th); 2011 in Orlando

CHI (Computer-Human Interaction; ACM SIG)
– Every Spring
– 2009 in Boston (27th); 2010 in Atlanta
– Used to be more quantitative but now open to other submission types

HCII and CHI (Conferences)


Other Usability Inspection Methods


It involves three types of participants:
– Representative users
– Product designers and developers
– Usability professionals

It is used with paper prototypes.

Participants are given tasks and as many paper prototypes as there are tasks.

All participants are asked to assume the role of the user.

Participants mark on their hardcopies the action they would take in pursuing the task.

Developers also serve as “living publications.” If participants need information they would look for in the manual, they can ask aloud.

Pluralistic Walkthrough: Overview (Other Usability Inspection Methods)


The walkthrough moderator provides the task.

Participants are asked to mark on their hard copy of the first screen the action(s) they would take in attempting the task.

After everyone has written their independent responses, the walkthrough moderator announces the “right answer.”

Users verbalize their responses first and discuss potential usability problems.

Product developers explain why the design is the way it is.

Usability experts facilitate the discussion and help come up with solutions.

Participants can be given a usability questionnaire after each task and at the end of the day.

Pluralistic Walkthrough: Process (Other Usability Inspection Methods)


Walkthrough must progress as slowly as the slowest participant (because we must wait for everybody before the discussion).
– Participants may not get a good feel for the flow.

Multiple correct paths cannot be simulated. Only one path can be evaluated at a time.
– This precludes participants from exploring, which might result in learning.
– Participants who picked a correct path that was not selected for the walkthrough must “reset.”

Participants who performed a “wrong” action must mentally reset as well.

Product designers and developers have to be thick-skinned and treat users’ comments with respect.

Pluralistic Walkthrough: Limitations (Other Usability Inspection Methods)


It is a cost- and time-effective method.

It provides early performance and satisfaction data from users.

It often offers redesign on the fly.

“I got it right but…”
– Participant responses that were correct but made with uncertainty can be discussed.

It increases developers’ sensitivity to users’ concerns, which leads to increased buy-in.

Pluralistic Walkthrough: Benefits (Other Usability Inspection Methods)


It involves evaluating 2 or more products with similar functionality. For example:
– “Us” vs. a key competitor / key competitors
– Only key competitors

Goals can vary. For example:
– Find usability issues in our product and tell us if there is anything that the competitors are doing better.
  • We don’t care about the usability issues of the other products.
– Compare approaches and make recommendations for our new product.
  • We want to know best practices and things we should avoid.

Competitive Evaluation (Other Usability Inspection Methods)


Client’s request:
– We are redesigning the way people obtain pet insurance quotes and enroll on our site. Help us understand what our competitors are doing.
– 4 competitors: A, B, C, and D
  • E.g., Pets Best

Approach:

Competitive Evaluation: Pet Insurance Example (Other Usability Inspection Methods)


Competitive Evaluation: Pet Insurance Example (Other Usability Inspection Methods)

Determine scope:
– Quote
– Enrollment

Review the sites and define aspects for comparison:
– Access to the quote path
– Process flow
– Plan education within the quote path

Competitive Evaluation: Pet Insurance Example (Other Usability Inspection Methods)

Aspects compared across competitors A, B, C, and D and the new design:

Access to the Quote Path
– Visibility on the Homepage
– Accessibility via Secondary Pages
– Approachability
– Number of Quote Paths
– Quote Retrieval – Method
– Quote Retrieval – Required Information

Process Flow
– Perceived Length of Quote Process
– Perceived Length of Enrollment Process

Plan Education Within the Quote Path
– Plan Information Accessibility from Quote
– Ability to Compare Plans in Quote Path
– Recommended Plan
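Purely as an illustration of how such a matrix might be recorded (not part of the lecture materials), the aspects above can be kept in a simple nested mapping of aspect to per-site ratings; the rating values shown are invented placeholders, not results from the pet insurance example.

```python
# Illustrative sketch of a competitive-evaluation matrix: aspects (rows) mapped
# to per-site ratings (columns). The ratings are invented placeholders, not
# results from the lecture's pet insurance example.
sites = ["A", "B", "C", "D", "New"]

comparison = {
    "Visibility on the Homepage": {"A": "good", "B": "poor", "C": "good", "D": "fair", "New": "TBD"},
    "Number of Quote Paths":      {"A": "2",    "B": "1",    "C": "1",    "D": "3",    "New": "TBD"},
    "Recommended Plan":           {"A": "yes",  "B": "no",   "C": "no",   "D": "yes",  "New": "TBD"},
}

# Print a simple text table, one row per aspect.
for aspect, ratings in comparison.items():
    row = "  ".join(f"{site}: {ratings[site]}" for site in sites)
    print(f"{aspect:30s} {row}")
```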

Competitive Evaluation: Pet Insurance Example (Other Usability Inspection Methods)

[The same comparison table, now populated with the evaluation results for sites A, B, C, and D and the new design; “Quote Retrieval – Required Information” was not applicable for two of the sites.]

Process flow: each cell in the table indicates a page, rather than a step. There may be multiple steps within a page.

Competitive Evaluation: Pet Insurance Example (Other Usability Inspection Methods)


Effectiveness of Usability Inspection Methods


Inspection methods (expert/heuristic evaluation, cognitive walkthrough) vs. empirical methods (user testing - UT)

Inspection methods only find 30 – 50% of issues that are found in UT.

– Inspection methods are not a substitute for testing.

Heuristic evaluation findings are better predictors for UT findings than cognitive walkthrough findings.

Inspection methods will reveal more minor problems than UT.
– E.g., inconsistent typography.

Inspection Methods vs. Empirical Methods (Effectiveness of Usability Evaluation Methods)

[Diagram contrasting findings from a usability test with findings from an evaluation.]


4 – 5 evaluators find ~80% of all problems that an inspection can find.
– 1 evaluator is better than none.
– 2 are better than 1.
– Etc.

More evaluators are better than more time.
– It is better to have two evaluators for 10 hours each than one evaluator for 20 hours.

Number of Evaluators (Effectiveness of Usability Evaluation Methods)
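The ~80% figure is consistent with the problem-discovery curve commonly cited in the usability literature, found(i) = 1 − (1 − λ)^i, where λ is the probability that a single evaluator uncovers a given problem. The sketch below is illustrative only; the value λ = 0.31 is an assumption taken from commonly reported figures, not a number given in this lecture.

```python
# Illustrative only: the problem-discovery curve found(i) = 1 - (1 - lam)^i.
# lam = 0.31 is an assumed per-evaluator discovery probability drawn from
# commonly reported figures, not a number given in this lecture.

LAM = 0.31  # assumed probability that one evaluator finds a given problem

def proportion_found(evaluators: int, lam: float = LAM) -> float:
    """Expected share of findable problems uncovered by a group of evaluators."""
    return 1 - (1 - lam) ** evaluators

for i in range(1, 7):
    print(f"{i} evaluator(s): {proportion_found(i):.0%}")
# Prints roughly 31%, 52%, 67%, 77%, 84%, 89% -- diminishing returns after 4-5.
```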


Rolf Molich’s CUE studies (1998 – 2009)
– CUE = Comparative Usability Evaluation
– 8 studies comparing outcome of usability evaluations using different methods and multiple evaluators.

CUE 3 by Rolf Molich (2001)
– 11 usability professionals independently evaluated avis.com using inspection methods.
– They found 220 problems in total (including 33 severe problems).
– Each evaluator found 16% of the total number of problems...
  • …and 24% of the severe problems.
– Average overlap between any 2 evaluators was only 9%.

Overlap Between Evaluators (Effectiveness of Usability Evaluation Methods)
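To make those percentages concrete, the arithmetic below (a sketch that only restates the CUE 3 figures quoted above) converts them into rough per-evaluator counts.

```python
# Rough per-evaluator counts implied by the CUE 3 figures quoted above.
total_problems = 220
severe_problems = 33

problems_per_evaluator = 0.16 * total_problems  # about 35 problems each
severe_per_evaluator = 0.24 * severe_problems   # about 8 severe problems each

print(round(problems_per_evaluator), round(severe_per_evaluator))  # 35 8
```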


Evaluators with user testing experience are better at predicting issues than evaluators with no such experience.

Double experts (usability and domain) find 1.5 times more problems than single experts (usability only).
– Pet insurance example:

Evaluators’ Expertise (Effectiveness of Usability Evaluation Methods)


Prescribed tasks focus assessment on particular areas of the interface; other areas are often left unaddressed.
– Preferred by evaluators.

Self-guided exploration will ensure broad coverage among evaluators.
– FIV+?
– 14 years old?

Possible solutions:
– Use both.
– Provide evaluators with typical use cases but encourage self-guided exploration.

Prescribed Tasks vs. Self-Guided Exploration (Effectiveness of Usability Evaluation Methods)

(1) Enroll Bubba, Male Domestic Short Hair Cat, 6 years, diabetes, zip code 10019.

(2) Enroll multiple pets, zip code 94102:
– Ole, Male Chihuahua, 10 years, no previous conditions.
– Percy, Female Persian, 2 years and 2 months, crystals.


How to Write Good Findings


Write issues in a way that they cannot be misinterpreted and dismissed.

Each issue description should have 4 components:
– Description of the problem (describe and show it)
– Justification for why this is a problem
– Description of the proposed solution
– Justification for why the proposed solution is better

Do not describe problems in terms of their solutions, e.g.:
– “Labels should be added to the icons.”

Problem vs. Solution (How to Write Good Findings)

[Example finding annotated to show its Problem and Recommendation parts.]
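As a working aid (my own sketch, not part of the lecture materials), the four components can be captured in a small structured record; the field names and the example finding are illustrative.

```python
# Hypothetical template mirroring the four components of a finding listed above.
# Field names and the example values are illustrative, not from the lecture.
from dataclasses import dataclass

@dataclass
class Finding:
    problem: str      # description of the problem (describe and show it)
    why_problem: str  # justification for why this is a problem
    solution: str     # description of the proposed solution
    why_better: str   # justification for why the proposed solution is better

example = Finding(
    problem="Toolbar icons are unlabeled.",
    why_problem="Users must guess each icon's meaning, which increases mental workload.",
    solution="Add short text labels beneath each icon.",
    why_better="Labels remove the guesswork and support first-time learnability.",
)
```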


Separate the problem from the solution because:
– Describing the problem makes everyone focus on users.
– Describing the problem facilitates providing some justification for why this is a problem.
– If the solution proposed is not feasible, the developers may be able to use the problem description to find an alternative solution.

Problem vs. Solution (How to Write Good Findings)


Can it impact efficiency?
– Does it increase users’ mental workload and make them slower?
– Does it introduce additional unnecessary steps?

Can it impact effectiveness?
– Can it lead to errors?
– Can it prevent users from being able to access features?

Can it impact user satisfaction?
– Is it unpleasing?
– Can it reduce credibility of the product?

Can it impact learnability?

Problem Justification: Why Is This a Problem? (How to Write Good Findings)


CUE 6 by Rolf Molich (2006)
– 13 professional usability teams independently evaluated Enterprise.com:
  • 7 used usability testing
  • 3 used inspection methods
  • 3 used a combination
– Sample usability findings:
– Teams reported very few positive issues.

“Express Your Annoyance Tactfully” (Molich) (How to Write Good Findings)

Design looks clunky and unprofessional and does not flow well. Does not look like an established car rental agency.

The fonts are very inconsistent throughout this page. For example, the “Looking for a quicker way to reserve?” text is smaller than the “Learn more” link. It seems as if the style sheets were blindly applied without any further editing and this only accentuates the unprofessionalism of this site.

The BBBOnline Reliability Program icon seems to just hang there – as if the designers didn’t know where else to stick it.


Exercise: Critique & Improve These Findings (How to Write Good Findings)

1.) The icons have different actions assigned:
– Icon 1: Brings up a download progress pop-up
– Icons 2, 3, 4, and 5: Bring up new pages/views
– Icon 6: Brings up a menu
– Icon 7: Brings up a new window

6.) The Close button is very small.

7.) The window does not close when Alt-F4 is pressed.

2.) The icons are not labeled.


Exercise: Critique & Improve These Findings (How to Write Good Findings)

15.) There is no highlighting or differentiation in the navigation panel between the active page and non-active pages.

8.) “Help” is only accessible through right clicking in the left sidebar and bottom border.


Exercise: Critique & Improve These Findings (How to Write Good Findings)

25.) Users have to close the window to save the changes that they made to the Options.


Exercise: Critique & Improve These Findings (How to Write Good Findings)

33.) The “City” field allows for the entry of all alpha-numeric values. Incorrect values produce “0 cities found.”


Exercise: Critique & Improve These Findings (How to Write Good Findings)

50.) The word “retrieve” is redundant.


Exercise: Critique & Improve These Findings (How to Write Good Findings)

52.) The control for “Both” is incorrect.


How to Write Good Recommendations


What do usability consultants offer stakeholders?

Goal of Usability Evaluations (How to Write Good Recommendations)

Discovering usability issues (50%) and developing recommendations to address the usability issues (50%).


Writing Recommendations: Current Literature

There is extant literature focusing on usability methods and conducting usability tests.

Disproportionately less attention has been given to the generation of recommendations.

Recent works, like Molich, Jeffries, and Dumas’ (2007) paper, ‘Making usability recommendations useful and usable’, focus on providing guidelines for generating recommendations.

However, to date, most literature and discussions have not provided specific recommendations on making recommendations, and for those that have, the focus has primarily been on text-based recommendations.

How to Write Good Recommendations


Severity ratings provide prioritization of usability issues.

The wording of recommendations should reflect and reinforce the severity ratings.

Getting Started: Setting the Tone

HIGH – MEDIUM – LOW

How to Write Good Recommendations


Low severity issues:
– “Consider” (e.g., Consider changing “Miscellaneous” to “Accessories.”)
– “May” (e.g., Removing button X may reduce user confusion when completing task Y.)

High severity issues:
– Don’t use directives (e.g., You must or It is imperative).
– When appropriate: We strongly recommend…
– For most recommendations, simply state the recommendation without qualifiers. E.g.:
  • Change “Miscellaneous” to “Accessories.”
  • Remove button X to reduce user confusion when completing task Y.

Medium severity issues: It’s a judgment call…

Getting Started: Setting the Tone (How to Write Good Recommendations)
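A minimal sketch (my own illustration, assuming a three-level severity scale) of how these tone guidelines could be encoded, for example when generating report templates; the wording templates are examples, not prescribed phrasings.

```python
# Illustrative mapping of severity level to the recommendation tone described
# above. The wording templates are examples, not prescribed phrasings.
def phrase_recommendation(severity: str, recommendation: str) -> str:
    """Wrap a recommendation in wording that matches its severity."""
    if severity == "low":
        return f"Consider the following: {recommendation}"  # soften with "consider"/"may"
    if severity == "high":
        return recommendation  # state it plainly, without qualifiers
    return f"We recommend the following: {recommendation}"  # medium: a judgment call

print(phrase_recommendation("low", 'Change "Miscellaneous" to "Accessories."'))
print(phrase_recommendation("high", "Remove button X to reduce user confusion when completing task Y."))
```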


– Affordance
– Cognitive load
– Error recovery
– Explorability/discoverability
– Fitts’s law
– Functional minimalism
– Mapping
– Mental model
– Salience
– Visual hierarchy / minimalism / weight / balance

Applying the science makes you sound educated, but providing actionable recommendations keeps you employed!

Usability Buzzwords (How to Write Good Recommendations)


Actionable recommendations are usable to stakeholders/clients.

Usability findings tell stakeholders what’s wrong. Recommendations tell stakeholders how to fix the problem.

Actionable Recommendations

Examples of vague recommendations and why they are problematic:

– Recommendation: “Provide white space to visually group the screen elements.”
  Why problematic: Inappropriate use of white space, either too little or too much, can create new usability problems.
– Recommendation: “Add visual weight to increase the area’s prominence.”
  Why problematic: Adding too much visual weight to one screen element can inappropriately shift visual attention away from other screen elements.
– Recommendation: “Make the button more salient.”
  Why problematic: There are multiple ways to make a button more salient (e.g., increase the font size, increase the size of the button, change the button’s placement).

How to Write Good Recommendations


Generic or vague recommendations can be worse than the omission of recommendations altogether.

Vague recommendations may not be actionable. As a result, the overall value of conducting the usability test is greatly diminished, despite key usability findings.

Implementing changes from vague recommendations can actually create more usability issues or cause a Web site or application to become less usable!

Examples:
– Change the label to match the user’s mental model.
– Provide white space between the two fields.
– Increase the visual weight of section X.

How Bad Can Bad Recommendations Be? (How to Write Good Recommendations)


Group Exercise: 1 of 4

Bad Recommendation vs. Actionable Recommendation

Increase the visual weight of the “Create JukeBox” link.

(Issue: Users did not see the “Create Jukebox” link)

How to Write Good Recommendations


Group Exercise: 2 of 4

Bad Recommendation vs. Actionable Recommendation

Make the toolbar more salient.

(Issue: The toolbar is easy to miss especially as users scroll down and back up to the top of the page)

How to Write Good Recommendations


Group Exercise: 3 of 4

Bad Recommendation vs. Actionable Recommendation

Improve the functional layering of the menu options.

(Issue: The “Order” and “Deliveries” menu options are the most frequently accessed options)

How to Write Good Recommendations


Group Exercise: 4 of 4

Bad Recommendation vs. Actionable Recommendation

Ensure consistent labeling across the application.

(Issue: The “Update Tax Type” link leads to a “Tax Type List” page.)

How to Write Good Recommendations


Some actionable recommendations can be communicated effectively through text.

Effective when used to recommend changes to:
– Labeling and terminology
– Removal of screen elements
– Messaging/descriptions
– Information hierarchy/taxonomies

Text-Only Recommendations (How to Write Good Recommendations)


Some actionable recommendations may not be effectively communicated through text only.

Visual mockups can help illustrate recommendations when text recommendations simply cannot provide clear direction without becoming too complex.
– E.g., Provide white space between section A and section B to provide visual grouping of information.

Effective when used to recommend changes to:
– Affordance
– Visual weight/balance
– Visual grouping
– Mapping

When Text is Not Enough (How to Write Good Recommendations)


Knowledge of color theory and use of expensive design tools (e.g., Adobe Photoshop) are not required to create mockups that illustrate recommendations.

Low-fidelity wireframes can illustrate workflows or process maps.

“Quick and dirty” wireframes can be quickly created using free or low-cost software.
– SnagIt
– FastStone (free alternative to SnagIt)
– Gimp (free alternative to Photoshop)

Usability Evaluators ≠ Designers? (How to Write Good Recommendations)


Example: www.frys.com

Creating ‘Quick and Dirty’ Mockups (How to Write Good Recommendations)


Example: www.frys.com

Creating ‘Quick and Dirty’ Mockups (How to Write Good Recommendations)


Mockups illustrate recommendations that may be difficult to describe with just words.
– E.g., describing “appropriate levels of white space” is difficult, and the recommendation can quickly become long-winded.

Mockups help ensure multiple recommendations gel together.
– Practitioners can ensure that multiple recommendations do not conflict with one another, rather than leaving this for clients to discover later.

Mockups help ensure that implemented recommendations do not create additional usability problems.
– For example, increasing a button’s size may shift the visual attention from other screen elements.

Advantages of Mockups in Recommendations (How to Write Good Recommendations)


Mockups have more visual impact.
– Showing how elements can be visually grouped instead of simply writing about visual grouping can drive change. It will also make your reports easier to present and more memorable.

Mockups aid the developers and coders who may be in charge of implementing change.
– Developers and coders are oftentimes not usability experts.

Mockups reduce both ambiguity and creative license for developers and coders when implementing change.

Advantages of Mockups in Recommendations (How to Write Good Recommendations)


Mockups only serve to illustrate recommendations.
– Regardless of whether clients have separate designers for a product, mockups illustrate design that is driven by usability principles and best practices.

Mockups do not include graphic treatment.
– Client expectations and interpretations of mockups need to be clearly and appropriately set. Mockups visually illustrate recommendations and should not necessarily be adopted as is.

Even though mockups can be created quickly, they do require a bit more time than text.
– Include this when planning a project so that there is ample time to create a compelling report.

Limits and Precautions of Using Mockups (How to Write Good Recommendations)


When to Use Text-Only vs. Mockups?

Usability and design principles to weigh when choosing text-only vs. mockup:
– Terminology and Labeling
– Functional Minimalism
– Cognitive Load
– Feedback
– Functional Layering
– Visibility
– Direct Manipulation
– Mapping
– Controls
– Error Prevention
– Fitts’s Law
– Affordance
– Hierarchy of Control
– Grouping
– Visual Hierarchy
– Reading Order
– Visual Weight
– Visual Balance
– Visual Scannability
– Aesthetics

It depends!

How to Write Good Recommendations


Great recommendations are not developed only after a usability evaluation or test.

Great recommendations are, in part, developed by having discussions with the stakeholder/client to understand:
– Business needs and objectives
  • Does the recommendation help achieve pre-defined success criteria (e.g., user registration, sales)?
– Technical constraints
  • Can the recommendation be implemented within the current software architecture?
– Implementation goals
  • Does the recommendation modify existing screens (evolutionary change) or does it define new user interactions, workflows, and/or mental models (revolutionary change)?

What Makes Recommendations Great? (How to Write Good Recommendations)


Depending on business needs, technical constraints, or implementation goals, recommendations may fall short of fully optimizing the user experience.

The goal of writing recommendations is to write actionable recommendations.

Provide your ideal recommendation along with alternative recommendations that account for the business needs, technical constraints, or implementation goals.

Providing More Than Ideal Recommendations

Recommendations

• Consider listing questions and answers that provide users information about the [site], the [parent company], and general information about [members].

− These questions can further legitimize the site for parents who may not be familiar with the [Organization].

• Alternatively, consider providing an About Us link within the utility navigation toolbar.

− The About Us page can contain information about the site and the [organization], while the FAQ page can focus specifically on [topic] questions.

How to Write Good Recommendations


If stakeholders/clients discuss the possibility of expanding functionality or redesigning the application/Web site, offer two levels of recommendations:
– Short-term recommendations
  • Actionable recommendations that can be immediately implemented without requiring a change to the software architecture or new code.
– Long-term recommendations
  • Actionable recommendations that could not be implemented given the current software architecture or code, but can be incorporated into a redesign or the next application rollout.

Offering Tiered Recommendations (How to Write Good Recommendations)


Addressing usability issues one at a time can be easier from a to-do list perspective.

A holistic view must be taken to ensure that all the recommendations, when implemented, address the usability issues identified during the evaluation. E.g.:
– Increasing the size of all five buttons will reduce the effect of increasing each individual button’s prominence and visibility.

Mockups that incorporate all recommendations made to an application or Web site’s page can help ensure that recommendations don’t conflict or create additional usability issues.
– Mockups also provide an excellent opportunity to show a before and after of screens/pages to stakeholders.

Putting It All Together (How to Write Good Recommendations)


Project 1b


Groups of 3 – 4 evaluators meet for a debriefing session.
– Discuss each usability issue and why it is an issue.
– Discuss positive findings / good practices.
– Agree on the final list of usability issues and positive findings.
  • Ensure there is no redundancy.
  • Ensure you have both problem description and justification.
– Come up with recommendations for the issues.

Each evaluator independently assigns a severity rating to each issue.
– The group decides which severity scale to use.

One evaluator combines all severity ratings, so that each issue has only one rating (one possible approach is sketched below).

Format – Word, PPT, etc.

Steps (Project 1b)
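One possible way to produce the single combined rating (an illustrative sketch, assuming the group settled on a numeric 1-3 scale; the issue names and scores below are invented) is to take the median of the independent ratings.

```python
# Illustrative sketch: combine independent severity ratings into one rating per
# issue by taking the median. Assumes a numeric scale (1 = low, 3 = high); the
# issues and scores below are invented examples.
from statistics import median

ratings = {
    "Toolbar icons are unlabeled": [2, 3, 2, 2],
    "Options window must be closed to save changes": [3, 3, 2, 3],
    "Close button is very small": [1, 2, 1, 1],
}

combined = {issue: median(scores) for issue, scores in ratings.items()}
for issue, severity in combined.items():
    print(f"{issue}: severity {severity}")
```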


Team members’ names and contributions to the project

Executive summary

(Brief) Introduction– Product evaluated– Objectives– Target users and context of use– Evaluation method (how was the evaluation conducted?)

Findings with severity ratings– Describe the problem and provide justification.

Recommendations
– Describe proposed solutions and provide justifications.
– Recommendations could accompany each finding or be presented at the end.

What to Include in the Report (Project 1b)


Are all the sections there? (see previous slide)

Are all sections well executed?

Are the findings properly explained?

Are the findings as precise and concise as possible?

Do the findings have appropriate severity ratings assigned to them?

Do the recommendations appropriately address the findings?

Are the recommendations justified and actionable?

Is the report well structured and laid out so that it is visually pleasing and easy to read?
– Proper alignment
– Consistency of fonts, sizes, spacing, phrasing, etc.

Grading Criteria (Project 1b)


Reading for Next Week

Handbook of Usability Testing by Rubin:
– Chapter 2: What is usability testing?
– Chapter 3: When should you test?
– Chapter 5: Develop the test plan.