Remote Usability Testing

Remote Testing Prague June 2006

Description

Prague ACM SIGCHI presentation from 26 June 2006 on remote usability testing, by Karsten Skuppin.

Transcript of Remote Usability Testing

Page 1: Remote Usability Testing

Remote Testing

Prague, June 2006

Page 2: Remote Usability Testing


"History" (I)

§ Usability testing for Otto
• Only specific areas / processes in the shop
• Not the "whole customer experience"

§ Development of "seasonal testings"
• Recruit participants who are about to buy something on otto.de
• Observe them doing just that in the lab

§ Open issues
• Testing is still not "real" enough
- Lab situation
- Tasks are performed with possibly low involvement
- Not all relevant tasks fall within the scope of a lab test

Page 3: Remote Usability Testing


"History" (II)

§ Idea: Remote Testing
• At home ("natural" environment)
• Real involvement
• "Natural" tasks (just what people want to do)

§ Challenges
• Tool must work without installation on the users' PCs
• Tool must work without having to change anything on the client's server
• We must be able to "observe" and "measure" (not only "ask") — see the proxy sketch below
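The slides do not show how the tool meets these constraints. One plausible approach, sketched here in TypeScript for Node.js, is a recording reverse proxy: participants browse the target site through the proxy, which forwards every request unchanged and injects a small feedback/recording script into the HTML it returns, so nothing is installed on the participants' PCs and nothing changes on the client's server. The host name and toolbar URL are assumptions for illustration, not the actual SirValUse implementation.

```ts
// Minimal sketch of a recording reverse proxy (Node.js + TypeScript).
// TARGET_HOST and the toolbar URL are illustrative assumptions.
import * as http from "node:http";

const TARGET_HOST = "www.example-shop.de";
const TOOLBAR = `<script src="https://survey.example.com/feedback-toolbar.js"></script>`;

http
  .createServer((clientReq, clientRes) => {
    // Forward the request to the target site; ask for an uncompressed body
    // so the HTML can be rewritten before it reaches the participant.
    const upstream = http.request(
      {
        host: TARGET_HOST,
        path: clientReq.url,
        method: clientReq.method,
        headers: { ...clientReq.headers, host: TARGET_HOST, "accept-encoding": "identity" },
      },
      (upstreamRes) => {
        const chunks: Buffer[] = [];
        upstreamRes.on("data", (chunk) => chunks.push(chunk));
        upstreamRes.on("end", () => {
          let body = Buffer.concat(chunks);
          if ((upstreamRes.headers["content-type"] ?? "").includes("text/html")) {
            // Inject the feedback toolbar just before </body>; everything else
            // passes through untouched, so the site behaves normally.
            body = Buffer.from(body.toString("utf8").replace("</body>", `${TOOLBAR}</body>`));
          }
          const headers = { ...upstreamRes.headers, "content-length": String(body.length) };
          delete headers["transfer-encoding"]; // body is no longer chunked
          clientRes.writeHead(upstreamRes.statusCode ?? 200, headers);
          clientRes.end(body);
        });
      }
    );
    clientReq.pipe(upstream);
  })
  .listen(8080, () => console.log("Recording proxy listening on http://localhost:8080"));
```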

Page 4: Remote Usability Testing


How does Remote Testing work?

[Diagram: Participants make natural use of the target web site through a Proxy Server, which passes their inquiries on to the Web Site. While surfing, participants give free and scaled feedback to a Survey Server. The SirValUse usability expert recalls all recorded actions of the respondents and retrieves the participants' comments and evaluations via the Survey Server. A sketch of the recorded data follows below.]
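The diagram only names the components. As a rough sketch of the data that flows between them (types and field names are assumptions, not the actual SirValUse model), the proxy's recordings and the expert's recall of them could be modelled like this:

```ts
// Illustrative only: what the proxy records and what the usability expert
// later recalls from the survey server. Field names are assumptions.
interface RecordedAction {
  participantId: string;
  timestamp: number;                        // ms since session start
  url: string;                              // page the participant was on
  kind: "pageview" | "click" | "mousemove";
  x?: number;                               // page coordinates for clicks / mouse traces
  y?: number;
}

// A toy in-memory store standing in for the survey server.
const recordedActions: RecordedAction[] = [];

// "Recalls all recorded actions of the respondents" (arrow in the diagram).
function recallActions(participantId: string): RecordedAction[] {
  return recordedActions.filter((a) => a.participantId === participantId);
}
```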

Page 5: Remote Usability Testing


Variants of Remote Testing

Critical Incidents
§ Entirely free and realistic utilisation of the site. Reported are "critical incidents" that …
• … trigger positive or negative sensations.
• … prove to be crucial in reaching the target, e.g. successful use of the search function or failure due to complicated texts.

Task-oriented
§ The respondents are confronted with tasks that they have to master either with or without a time limit.
§ The participants themselves may assess whether they have completed a task or not; the system is also able to recognise multiple target conditions (see the sketch below).
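The slides do not explain how the system recognises multiple target conditions. One plausible mechanism, sketched below with assumed names and rules, is to match each page view recorded by the proxy against a set of completion rules (e.g. URL patterns), so that a task can be closed automatically in addition to the participant's own "task done" assessment.

```ts
// Sketch only: how a remote tool could recognise several target conditions
// for one task. Task content and URL rules are illustrative.
interface TargetCondition {
  description: string;
  matches: (url: string) => boolean;
}

interface Task {
  id: string;
  instruction: string;
  targets: TargetCondition[]; // reaching any one of these counts as completion
  timeLimitMs?: number;       // optional, per the "with or without a time limit" variant
}

const securityTask: Task = {
  id: "security-package",
  instruction: 'Look for information regarding the "Security Package".',
  targets: [
    { description: "Product page reached", matches: (url) => /security-?package/i.test(url) },
    { description: "FAQ entry reached", matches: (url) => url.includes("/faq/security") },
  ],
  timeLimitMs: 10 * 60 * 1000,
};

// Called for every page view recorded by the proxy.
function checkCompletion(task: Task, visitedUrl: string, elapsedMs: number): "done" | "timeout" | "open" {
  if (task.timeLimitMs !== undefined && elapsedMs > task.timeLimitMs) return "timeout";
  return task.targets.some((t) => t.matches(visitedUrl)) ? "done" : "open";
}
```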

Page 6: Remote Usability Testing


What does the respondent see? Critical Incident

[Screenshot: the respondent's view during a critical-incident session, with controls to call up feedback sessions, go back to the questionnaire, give positive feedback, give negative feedback, and end the visit.]

Page 7: Remote Usability Testing


What does the respondent see? Critical Incident – positive feedback

[Screenshot: positive feedback form. "You want to give us positive feedback? Please describe the event or your observation!"

(1) What was it that you wanted to do just before the event you want to tell us about?
    Help text: "Please describe short and precisely in your own words what you wanted to do." (with an example)
(2) Please describe the event or the observation and tell us what you liked about it.
    Help text: "Please describe the event or your observation in a way that a good friend of yours would understand it." (with an example)
(3) Does the event help you in doing what you wanted to do - as described in (1)?
    Scale: very helpful / somehow helpful / not helpful at all
(4) How much do you like this event?
    Scale: very much / somehow / not at all

Buttons: cancel | submit … and continue surfing.]

Page 8: Remote Usability Testing


What does the respondent see? Critical Incident – negative feedback

[Screenshot: negative feedback form. "You want to give us negative feedback? Please describe the event or your observation!"

(1) What was it that you wanted to do just before the event you want to tell us about?
    Help text: "Please describe short and precisely in your own words what you wanted to do." (with an example)
(2) What happened? Please describe the event or the observation in as much detail as possible.
    Help text: "Please describe the event or your observation in a way that a good friend of yours would understand it." (with an example)
(3) Does the event prevent you from doing what you wanted to do - as described in (1)?
    Scale: very much / somehow / not at all
(4) How much do you dislike this event?
    Scale: very much / somehow / not at all
(5) What were your expectations? Do you have any ideas what we could improve?
    Help text: "What did you expect from T-Online? What or how could we do better?" (with an example)

Buttons: cancel | submit … and continue surfing.]
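Taken together, the two forms map onto a small data record. As an illustrative sketch (field names and the example content are assumptions, not the tool's real schema), a submitted critical incident could look like this:

```ts
// Illustrative shape of one submitted critical-incident report.
// The scales mirror the 3-point options on the two feedback forms.
type ThreePoint = "very much" | "somehow" | "not at all";
type Helpfulness = "very helpful" | "somehow helpful" | "not helpful at all";

interface CriticalIncidentReport {
  polarity: "positive" | "negative";
  goal: string;                      // (1) what the respondent wanted to do
  description: string;               // (2) the event or observation
  impact: Helpfulness | ThreePoint;  // (3) helps / prevents reaching the goal
  affect: ThreePoint;                // (4) how much the respondent (dis)likes the event
  improvementIdeas?: string;         // (5) expectations / suggestions, negative form only
  url: string;                       // page on which the report was opened
  submittedAt: string;               // ISO timestamp
}

// Hypothetical example of a negative report.
const example: CriticalIncidentReport = {
  polarity: "negative",
  goal: "Find information on the Security Package",
  description: "The site search returned no results for the product name",
  impact: "very much",
  affect: "somehow",
  improvementIdeas: "Accept common spellings of the product name in the search",
  url: "https://www.example.com/search?q=security+package",
  submittedAt: "2006-06-26T10:15:00Z",
};
```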

Page 9: Remote Usability Testing


What does the respondent see? Task-oriented

[Screenshot: task-oriented view. A task panel shows the task text together with buttons to go back to the questionnaire, mark the task as done, or cancel the task.

Task: "You're looking for a solution from T-Online in order to protect your PC from viruses, worms or other dangers that may be connected with the Internet. Please look for some information regarding the 'Security Package'."]

Page 10: Remote Usability Testing


What does the test leader see? Screenshot, mouse traces, click points
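The recording code itself is not shown in the slides. A minimal browser-side sketch of how the injected script could capture click points and a thinned mouse trace (the endpoint URL is an assumption):

```ts
// Minimal sketch: capture click points and a sampled mouse trace in the
// participant's browser and send them to a (hypothetical) survey endpoint.
const ENDPOINT = "https://survey.example.com/events"; // assumed URL

type UiEvent = { kind: "click" | "mousemove"; x: number; y: number; t: number; url: string };
const buffer: UiEvent[] = [];

function record(kind: UiEvent["kind"], e: MouseEvent): void {
  buffer.push({ kind, x: e.pageX, y: e.pageY, t: Date.now(), url: location.href });
}

// Every click is recorded with its page coordinates.
document.addEventListener("click", (e) => record("click", e), true);

// Sample the mouse position at most every 100 ms to keep the trace small.
let lastSample = 0;
document.addEventListener("mousemove", (e) => {
  if (Date.now() - lastSample > 100) {
    lastSample = Date.now();
    record("mousemove", e);
  }
});

// Flush the buffer when the participant leaves the page.
window.addEventListener("pagehide", () => {
  if (buffer.length > 0) {
    navigator.sendBeacon(ENDPOINT, JSON.stringify(buffer));
  }
});
```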

Page 11: Remote Usability Testing


Example 1: Critical Incidents on otto.de

Questions
§ How is otto.de "really" being used? How do the respondents proceed while browsing, searching or purchasing?
§ Which elements or functions stimulate or obstruct purchase processes?
§ Which optimisation measures can be derived from these insights?

Method and Implementation
§ 60 respondents were recruited offline.
• They were requested to use otto.de "in a normal way" for 6 weeks, but only via the remote system.
• While surfing they could report "critical incidents."
§ The remote system recorded all of the participants' actions as well as the pages they visited.
• As a result, valuable clues regarding optimisation measures could be generated.

Page 12: Remote Usability Testing


Example 2: Task-oriented benchmarking

Questions
§ As a means of further developing the search engine, one of our clients asked us to conduct a benchmarking study covering 7 search interfaces:
• Where are the strengths and weaknesses of the individual searches?
• Which search options are employed for which search?
• How long does it take to find a result, and how satisfied are the participants with it?

Method and Implementation
§ Altogether 175 users were recruited via a panel.
§ 75 of them employed 3 search interfaces each (rotated presentation) through the remote system.
• Certain search tasks had to be completed (e.g. "Please search for a picture of the Cologne Cathedral"), while the remote system recorded the times, the options employed, etc. (see the aggregation sketch below).
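The success rates and durations on the following slides can be computed directly from such recordings. A small per-interface aggregation sketch (the TaskRun shape is an assumed export format, not the tool's actual one):

```ts
// Sketch of the per-interface aggregation behind the benchmark charts.
interface TaskRun {
  searchInterface: string;   // e.g. "Search A" … "Search G"
  participantId: string;
  success: boolean;          // did the participant find something suitable?
  durationMs: number;        // time from task start to completion
}

function benchmark(runs: TaskRun[]) {
  const byInterface = new Map<string, TaskRun[]>();
  for (const run of runs) {
    const group = byInterface.get(run.searchInterface) ?? [];
    group.push(run);
    byInterface.set(run.searchInterface, group);
  }
  return [...byInterface.entries()].map(([name, group]) => {
    const successful = group.filter((r) => r.success);
    return {
      searchInterface: name,
      n: group.length,
      // "Search success": share of runs in which something suitable was found.
      successRate: successful.length / group.length,
      // "Search duration": average over successful runs only, in seconds.
      avgDurationSec:
        successful.reduce((sum, r) => sum + r.durationMs, 0) / (successful.length || 1) / 1000,
    };
  });
}
```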

Page 13: Remote Usability Testing


Example 2: Task-oriented benchmarking: excerpted results

Search Success (All Tasks)

[Bar chart. Question: "Did you find anything that would meet your requirements?" Success rates per engine for Search G, Search C, Search F, Search A, Search B, Search D and Search E; values shown on the slide: 81%, 82%, 83%, 87%, 87%, 88% and 94%. Basis: persons who tested the respective search engine (N=75).]

Page 14: Remote Usability Testing


Example 2: Task-oriented benchmarking: excerpted results

Search Duration (All Tasks)

[Bar chart: average search duration in seconds, broken down by Total, Customers and Frequent Users, for Search F (N=65), Search D (N=62), Search B (N=62), Search C (N=66), Search A (N=65), Search G (N=71) and Search E (N=61). Basis: persons who successfully searched with the respective search engines.]

Page 15: Remote Usability Testing


Example 2: Task-oriented benchmarking: excerpted results

Use of Image Search

[Bar chart: When was the image search used in searching for a picture? Shares shown for Total, Customers and Successful Search, for Search D, Search E, Search C, Search A, Search G, Search F and Search B. Basis: persons who tested the respective search engines (N=75).]

Page 16: Remote Usability Testing


Example 2: Task-oriented benchmarking: excerpted results

Use of Search Sections

[Pie chart: How often were search sections employed? Never: 44%, Once: 14%, 2 times: 21%, 3 times: 15%, 4-6 times: 6%. Basis: total, per person during all searches (N=525).]

Page 17: Remote Usability Testing


Example 3: Click Tracking

Page 18: Remote Usability Testing


Example 4: Testing of Variants

Questions
§ Four variants of a navigation scheme within an online shop were to be tested:
• Which navigation scheme leads to satisfactory results (users find products they would like to buy)?
• Which navigation scheme can be used effectively and efficiently (fewer clicks, less time, fewer problems)?

Method and Implementation
§ Online recruitment of 1,000 users per variant (assignment sketched below).
• Users were asked to solve a task (same task for each variant).
• Remote tool to measure effectiveness and efficiency.
• Short online questionnaire to assess satisfaction.
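The slides do not say how the 1,000 users per variant are assigned. A common approach, sketched here with assumed names, is to derive the variant deterministically from the participant ID so that each person always sees the same variant, and to tag every recorded session with it for the later per-variant comparison of effectiveness, efficiency and satisfaction.

```ts
// Sketch: stable assignment of participants to one of four navigation variants.
// Variant IDs and the hashing scheme are assumptions, not from the slides.
const VARIANTS = ["A", "B", "C", "D"] as const;
type Variant = (typeof VARIANTS)[number];

function assignVariant(participantId: string): Variant {
  let hash = 0;
  for (const ch of participantId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple, stable string hash
  }
  return VARIANTS[hash % VARIANTS.length];
}

// Every recorded session is then tagged with its variant, so task success,
// clicks, time and questionnaire satisfaction can be aggregated per variant
// in the same way as in the benchmarking sketch above.
const variant = assignVariant("participant-0042"); // always the same value for this ID
```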

Page 19: Remote Usability Testing


Example 5: Continuous Quality Control of a Portal

Questions
§ One of the biggest internet service providers in Germany wants to monitor the quality of its portals continuously:
• How do users assess content, usability, design etc. while using the portal?
• Can users use the portal efficiently and effectively?
• How can the "total user experience" be improved so that users intensify their usage?

Method and Implementation
§ Online recruitment of 2,500 users per month.
• 2,000 users solve 40 tasks to generate valid measurements.
• 500 users surf the portal via the "critical incident" technique to generate ideas for optimisation.
• Short reports every month, big reports every six months.

Page 20: Remote Usability Testing


Research Questions

§ Critical incidents:
• Capturing actual user behaviour
• Adding "reasons" and "qualitative feedback" to your log files

§ Task-oriented:
• "Hard figures" needed
• Variant testing
• Benchmarking with competitors

§ General:
• Generate log files
• Quality control
• Target groups (e.g. B2B) that are difficult to recruit
• Geographically widespread target groups

Page 21: Remote Usability Testing


Why Remote Testing?

Advantages over Lab Tests
§ The test setting provides a very high degree of ecological validity.
§ Many respondents can participate within a short time.
§ Participants from very diverse geographical regions can be recruited.
§ Studies that require several appointments per respondent can be realised without much effort.
§ Participants are in their usual surroundings; they are not bound by fixed test dates.

Customer Feedback
§ "This tool made it possible for the first time to conduct a test directly in the living room of Otto.de users, i.e. in the environment in which buying experiences are created and buying decisions are made." (Steffen Kehr, Customer Experience Management, www.otto.de)

Page 22: Remote Usability Testing


Does Remote Testing have any disadvantages over lab tests?

Disadvantages over Lab Tests
§ Motivations and problems cannot be explored in depth.
§ There is no opportunity to probe for more specific feedback.
§ Sessions cannot be adapted to the wishes and capacity of the participants during an interview.
§ There are no video recordings of the respondents' facial expressions or gestures.

Remote Testing and Lab Tests: a Useful Combination
§ Remote Testing is complementary to qualitative sessions.
§ Problems identified by means of Remote Testing can be investigated thoroughly in a lab test.
§ Problems identified in lab tests or by expert reviews assist in further interpreting the results of task-based Remote Tests.

Page 23: Remote Usability Testing


Automated Remote-Testing: User Experience Scorecard

Page 24: Remote Usability Testing


Automated Remote-Testing: User Experience Scorecard

Page 25: Remote Usability Testing


Thank you for your attention!

SirValUse Consulting GmbH
Schlossstraße 8g
22041 Hamburg

Phone: +49 40 68 28 27 - 20
Fax: +49 40 68 28 27 - 20

[email protected]