UX Research: What They Don't Teach You in Grad School

Posted on 03-Dec-2014


Description

Three case studies on UX techniques and methodologies that will inspire, amaze, and possibly strike fear. But, through it all, lessons learned from the field and fundamentals of UX research will be presented. The goal is to depart with practical perspectives and sufficient rigor to guide a course towards a customer-aware corporate strategy. *Please note we had technical difficulties during the Q&A, so we were unable to 'close out' properly, but the presentation was recorded without issue.*

Transcript of UX Research: What They Don't Teach You in Grad School

UX Research: What I did not learn in grad school…
Gavin Lew

Executive Vice President, GfK User Centric

@glew

GfK User Centric, Chicago, October 2012

Introduction


A little about me…

Gavin S. Lew


Adjunct Faculty


I did not finish my PhD


What Happened?


What Did I Learn in Grad School?

[Picture: grad school]


What Did I Know?


Founded User Centric

Started UC back in 1999… we are now GfK User Centric

• 150+ global UX consultants with post-graduate degrees in behavioral sciences, human factors, or human-computer interaction


Philosophy 1.0

At UC, we don’t “sell services” like:
– Usability testing
– Other forms of user experience research

What we strive to do is answer client questions
– Methodologies and techniques are just tools


We have the Privilege of…
Interacting with, designing for, and testing many user experiences
But, this also means…


Sometimes I Feel Like Clients Ask Us to…

But, most of our work…


This SUCKS!!!


Philosophy 2.0

We believe that any business can be successful
If they could just…
Take a bite out of suck
We take projects where we can have a positive impact with our clients, one that transforms their users’ experience.


Techniques (User Experience)


But, What Does It Really Mean?

[Picture: thinker]


Session Topics


What They Do Not Teach You in Grad School

1. “The experience you craft is more than just the product”

2. “Yes, usability can be measured…”

3. “Sometimes research is COMPLICATED”

4. “Design is not always walk-up-and-use”


1. “The experience you craft is more than just the product”

[Diagram: the product or service at the center, surrounded by its touchpoints: Web Site, Out-of-the-Box Experience, IVR, User Guide, Store Experience, e-com, Call Centers, Paper Bills, HR]


Philosophy 3.0

Must think beyond just the product itself…

We Believe Experiences Matter

2. “Yes, usability can be MEASURED”

Must Fight Naysayers

As UX practitioners, we believe that concepts such as easy-to-use, intuitive, and usable can be measured

Unfortunately, naysayers believe that we cannot measure these things:
“I know it when I see it”

Ultimately, what we do is MEASURE and CHANGE

Does Anyone See More Than Snow?


Anyone See a Dog?

Once you see it, you cannot help but see it

Dalmatian “pops”

FedEx Spinner

Who sees it?

More help

FedEx Spinner is “now” locked

Spinner just “pops”

Forgive me. I just ruined your commute!


“You just know it when you see it”

Naysayers: You Cannot Measure Usability


Naysayers: Cannot Measure “Intuitiveness”
Correct this belief
Measurement must be:
– Well defined
– Observable
– Quantifiable
– Repeatable

Not All Measures Are Created Equal

Bad / Inappropriate / Good


Consider a Horse Race: Which Measure is Good?
What measure is used?

Different Conditions: Yes, That is Snow

Yes, this is snow…!!!


Some believe the horse wants to win…


Procedure

– I will give you a task

– I will count time

– Make a Yes or No decision

– Remember the time

– Raise your hand

• Right = Yes / Left = No

Let’s practice first

Exercise
Is the man wearing a red shirt?
Decide Yes or No. Note the time. Raise your hand. Ready?

[Timer slides: 1 to 9 seconds]

Task 1
Hospital setting: assessment of different prompts for an interaction with an interface. The patient has declined further treatment. The physician asked you to go into the system and cancel all of the orders. So, you select the orders and press Cancel.
Decide Yes or No. Note the time. Raise your hand. Ready?

[Timer slides: 1 to 10 seconds]

Task 2 (same task, different prompt)
Hospital setting: the patient has declined further treatment. The physician asked you to go into the system and cancel all of the orders. So, you select the orders and press Cancel.
Decide Yes or No. Note the time. Raise your hand. Ready?

[Timer slides: 1 to 10 seconds]

Why Does This Matter?
You can feel easy-to-use. You can feel bad interfaces, and also measure the effect.
You need to design this way.

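To make “measure the effect” concrete, here is a minimal sketch of how decision times from an exercise like the one above could be compared across the two prompt designs. The times and the use of an independent-samples t-test are illustrative assumptions, not data or analysis from the talk.

```python
# Hypothetical decision times (seconds) for the two prompt designs.
# In the live exercise, you would record when each participant raised a hand.
from scipy import stats

prompt_1_times = [4.1, 5.3, 6.0, 4.8, 5.5, 6.2, 4.9, 5.1]  # Task 1 prompt
prompt_2_times = [7.4, 8.1, 6.9, 8.6, 7.8, 9.0, 7.2, 8.3]  # Task 2 prompt

# Independent-samples t-test: is the difference in mean decision time
# larger than we would expect by chance?
t_stat, p_value = stats.ttest_ind(prompt_1_times, prompt_2_times)

print(f"Prompt 1 mean: {sum(prompt_1_times) / len(prompt_1_times):.1f} s")
print(f"Prompt 2 mean: {sum(prompt_2_times) / len(prompt_2_times):.1f} s")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```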

3. “Sometimes research can be COMPLICATED…”

One Can Always Get Data…

1) Is this really “good” data?

2) Will the results produce actionable change?

Integrity is everything in research


Pick a Device, Any Device!


Sanitized MP3 Players

Five selected target interfaces: Alpha, Bravo, Charlie, Delta, Echo


Client Objective
Client asked User Centric to:
– Understand the user experience related to these devices
– Identify usability issues
– Recommend possible solutions to improve the UI

Challenge: Help create a best-in-class UI

Sounded reasonable to us


Effort Seemed to be Largely Formative

Discovery emphasizes the qualitative

Testing is pragmatic with small samples to iterate the design

But, not in this case, because…


Case Study #1
But, the client had different needs

Design Research Included:
21 Tasks of Interest were selected
– High Frequency of Use (“Play Song”)
– Priority (“Create Playlist”)
5 User Interfaces
– Four Competitors
– One Client Design (Echo)
Alpha, Bravo, Charlie, Delta, Echo
User Centric, Inc., UPA, June 2006

Task x Device Matrix


Case Study #1
But, the client had EVEN MORE needs…

Asked to Conduct Research to:
Ensure that the new design will be best-of-breed relative to the competition
Oh, we failed to mention:
– This is extremely high profile
– Data will drive strategy
– Report will go directly to C-level executives…
And oh, did I mention that we’re gonna need the results to be statistically significant…?

Initial Design
21 Tasks x 5 Designs
– 100+ combinations (21 x 5 = 105)
Picked: Competitive Usability Testing
Within-subjects design? NOT, because of:
– Learning
– Fatigue
Between-subjects design…? How?

When We Looked Closer…
It actually got worse!
Not All of the Tasks Worked for Each Device!
And even worse…

Case Study #1
But it gets even better! The client had another UI variant to add…
And New Baby Makes Six!
What else can I say?

Case Study #1
Could we try talking the company out of doing these things?
Nope! Results will drive strategy.

We Did Manage to Convince the Company that
UC would design the study such that it would be sensitive enough to detect statistical significance, if it indeed existed.
Thus, there would be NO a priori assurances of finding significant differences, because they might not exist!
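As a hedged illustration of what “sensitive enough to detect statistical significance” implies, here is a minimal power-analysis sketch using statsmodels. The effect size, alpha, and power targets are assumptions for illustration, not the values used in the study.

```python
# Minimal power-analysis sketch; all parameters are assumed for illustration.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# How many participants per group to detect a medium effect (Cohen's d = 0.5)
# at alpha = 0.05 with 80% power?
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"Participants needed per group: {n_per_group:.1f}")

# Conversely: with ~10 participants per device-by-task cell (the study's floor),
# how large would an effect have to be for us to reliably detect it?
detectable = analysis.solve_power(nobs1=10, alpha=0.05, power=0.80)
print(f"Detectable effect size at n=10 per cell: {detectable:.2f}")
```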

The Real Challenge…

We were charged with:

– Research activities

– Methodology that would provide data to justify design direction

A device was going to be built…


Core Elements
– Access points (navigating to a feature is easy)
– Feature task flows (completing the task may be hard)
– Design look and feel
– Iconography
– Verbiage

Experimental Approach
Realistically, participants could only interact with 2-3 designs effectively (not 5-7)
– Assumed each participant could complete ~6 tasks
Create prototypes on a computer
– Level the “playing field” to core task flow elements
Usability testing / Quantitative data collection
– Recruited target demographic (incl. high schools)
– Needed simultaneous test teams
– In the end, we really needed to know the “story”

Control for Bias: Create Blocks
The sheer size of all possible combinations meant we had to create blocks.
Within each block…
Order of task presentation
– Tasks were systematically counter-balanced to reduce learning and order effects
Order of device presentation
– For each participant, devices were randomized within each task to reduce learning and order effects
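Below is a minimal sketch of the counterbalancing and randomization just described. The task numbers and device names mirror the example block shown later, but the helper itself is hypothetical, not the team’s actual tooling.

```python
# Sketch of counterbalancing task order and randomizing device order within a
# block. This is a hypothetical helper, not the actual study scripts.
import random

def build_participant_schedule(block_tasks, devices, participant_index):
    """Return an ordered list of (task, device) trials for one participant."""
    rng = random.Random(participant_index)

    # Counterbalance task order across participants with a simple rotation
    # (a Latin-square-style scheme keyed on the participant index).
    rotation = participant_index % len(block_tasks)
    task_order = block_tasks[rotation:] + block_tasks[:rotation]

    schedule = []
    for task in task_order:
        # Randomize device order within each task to reduce order effects.
        device_order = list(devices)
        rng.shuffle(device_order)
        schedule.extend((task, device) for device in device_order)
    return schedule

# Example: a participant assigned tasks 11, 6, 4, 15, 17 on Alpha/Bravo/Foxtrot.
print(build_participant_schedule([11, 6, 4, 15, 17], ["Alpha", "Bravo", "Foxtrot"], 7))
```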

Participants were Assigned to Blocks
Individual participants received a block
– Assigned randomly to reduce learning effects
Constraint: Familiarity biases were avoided
– iPod owners would not interact with iPods
Estimated what could go into a 60-min session
– Blocks of four to six tasks seemed to “fit”
– Using total number of steps to complete…

So…How Many Participants?
If each participant received five or six (of the 21) tasks by three (of the six) devices
– There are 20 unique device combinations
• P1 = Devices ABC; P2 = ABD; P3 = ABE…
To complete a full block, 20 participants were needed
Each device-by-task cell was represented with at least 10 participants
– N=80 to have four blocks of 20
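A quick sketch verifying the arithmetic on this slide, assuming the counts given (six devices, three per participant, blocks of 20, four blocks):

```python
# Verify the block arithmetic: 6 devices taken 3 at a time, blocks of 20
# participants, four blocks => N = 80.
from itertools import combinations

devices = ["Alpha", "Bravo", "Charlie", "Delta", "Echo", "Foxtrot"]

triples = list(combinations(devices, 3))
print(len(triples))                      # 20 unique device combinations

# In one block of 20 participants (one per triple), each device appears in
# C(5, 2) = 10 triples, so every device-by-task cell in that block's task
# set is covered by 10 participants.
print(sum(1 for t in triples if "Echo" in t))   # 10

blocks = 4                               # four blocks cover the ~21 tasks
print(blocks * len(triples))             # 80 participants in total
```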

So, This Looked Like This…

Participant #1: Julian, 16-24 years
– task 11: Bravo, Foxtrot
– task 6: Alpha, Foxtrot, Bravo
– task 4: Bravo, Foxtrot, Alpha
– task 15: Foxtrot, Bravo
– task 17: Foxtrot, Bravo, Alpha

Participant #2: Tracy O., 25-44 years
– task 16: Bravo, Delta, Charlie
– task 13: Charlie, Delta, Bravo
– task 8: Delta, Bravo, Charlie
– task 18: Bravo, Delta
– task 9: Charlie, Bravo, Delta

Measures
Time-on-Task
Efficiency (Deviation from Optimal Path)
– Total screens viewed / Optimal path for the task
• More incorrect “steps” increases this metric
Success
– % of participants in each cell (device x task) who successfully completed the task
Preference
– Pair-wise device preferences for a particular task, with a magnitude judgment
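These measures translate directly into scoring code. Below is a minimal sketch assuming a hypothetical per-trial log; the field names and values are illustrative, not the study’s actual data (preference, the pair-wise magnitude judgment, would be collected separately):

```python
# Scoring sketch for the slide's measures. Trial records and field names are
# hypothetical, for illustration only.
trials = [
    {"device": "Echo", "task": 11, "time_s": 42.0, "screens": 9, "optimal": 6, "success": True},
    {"device": "Echo", "task": 11, "time_s": 55.5, "screens": 14, "optimal": 6, "success": False},
    {"device": "Bravo", "task": 11, "time_s": 38.2, "screens": 7, "optimal": 6, "success": True},
]

def cell_metrics(records, device, task):
    """Summarize one device-by-task cell."""
    cell = [r for r in records if r["device"] == device and r["task"] == task]
    return {
        # Time-on-Task: mean seconds per trial in the cell
        "time_on_task": sum(r["time_s"] for r in cell) / len(cell),
        # Efficiency: total screens viewed / optimal path
        # (1.0 is a perfect path; incorrect steps push it higher)
        "efficiency": sum(r["screens"] for r in cell) / sum(r["optimal"] for r in cell),
        # Success: % of participants in the cell who completed the task
        "success_pct": 100 * sum(r["success"] for r in cell) / len(cell),
    }

print(cell_metrics(trials, "Echo", 11))
```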

Task-by-Task Analysis


Sometimes numbers do not tell the whole story

Sometimes the runner up Ain’t that Bad


What was the REAL STORY…?

N = 80?... I asked for 100+


Usability Issues / Participant Verbalizations


Results Drove Iterative Design Process
Start with the high frequency / high priority tasks…
– Why did the Foxtrot design win?
– Why did the Alpha design lose?
– Compare quantitative with qualitative
Complex tasks
– The fastest time was not the best
– More clicks, less error, high satisfaction
Sometimes winners would emerge for different reasons…
– How do you weigh different UI conventions?
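One way to make “how do you weigh different UI conventions” concrete is a weighted scorecard across the measures. The weights and normalized scores below are hypothetical, purely to show the mechanics:

```python
# Hypothetical weighted scorecard across the measures; weights and normalized
# scores (0-1, higher is better) are illustrative only.
weights = {"success": 0.40, "efficiency": 0.25, "time": 0.20, "preference": 0.15}

designs = {
    "Foxtrot": {"success": 0.95, "efficiency": 0.80, "time": 0.70, "preference": 0.85},
    "Alpha":   {"success": 0.70, "efficiency": 0.60, "time": 0.90, "preference": 0.55},
}

for name, scores in designs.items():
    total = sum(weights[m] * scores[m] for m in weights)
    print(f"{name}: {total:.2f}")
```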

Lessons Learned
“Know the story”
– Benefit of qualitative data
– Absolute “must have”
– Extra 20 participants used for qualitative data
Learning
– Counterbalancing was sufficient
– But, these are not walk-up-and-use devices…
Avoid “Frankenstein” Design
– Do not simply pick the winning task flow and implement
– Consistency matters!

Group Exercise (3 of 3): How to Write Good Recommendations

All too often, we over-focus on making things SIMPLE

4. “Design is NOT always about walk-up-and-use”

Not Everything Should Be WALK UP AND USE

Expert interfaces are around us everywhere

All too often… we design for the first hour of use, NOT the first year of use


Call Center Interface

While on a Call, Knowing “History” Usually Helps
Click Notes
Specific Notes Can Be Opened

Imagine…
– Approx. 100 calls per day
– Typical environment involves cubicle farms
– Stand and ask questions
– Multi-tasking
– Time pressure
– Possible sales incentives in effect
– Rapid consumption of screens
Does this change anything?!?!

Expert Users
Demonstrate over-learned behaviors
– High number of transactions, huge volume of calls
– Rote memorization of commands and actions
Emphasis is all about their workflow
– They make transactions so quickly, across multiple systems, and in most cases, they do not need to look at the entire screen
– IMPACT: Users will not look at individual notes and they will be less informed, thus driving calls back!!!
Reality:
– Traditional walk-up-and-use methods may be totally inappropriate and insufficient

Main Screen


Takeaway Lessons


Core Issues
We Change
We Measure
Thank you

1. “The experience you craft is more than just the product”
2. “Yes, usability can be MEASURED”
3. “Sometimes research can be COMPLICATED…”
4. “Design is NOT always about walk-up-and-use”