IWMW 2006: User Testing on a Shoestring Budget (1)


Transcript of IWMW 2006: User Testing on a Shoestring Budget (1)

Page 1: IWMW 2006: User Testing on a Shoestring Budget (1)


A centre of expertise in digital information management
www.ukoln.ac.uk


Usability on a Shoestring Budget (1)

Emma Tonkin & Greg Tourte
Software & Systems, UKOLN
HCI Department, University of Bath

www.bath.ac.uk

Page 2: IWMW 2006: User Testing on a Shoestring Budget (1)


Introduction
• UKOLN, the University of Bath
• HCI Group
• Why this session?

Page 3: IWMW 2006: User Testing on a Shoestring Budget (1)


Why do projects fail?

Project Impaired Factors                    % of the Responses
1.  Incomplete Requirements                          13.1%
2.  Lack of User Involvement                         12.4%
3.  Lack of Resources                                10.6%
4.  Unrealistic Expectations                          9.9%
5.  Lack of Executive Support                         9.3%
6.  Changing Requirements & Specifications            8.7%
7.  Lack of Planning                                  8.1%
8.  Didn't Need It Any Longer                         7.5%
9.  Lack of IT Management                             6.2%
10. Technology Illiteracy                             4.3%
11. Other                                             9.9%

Page 4: IWMW 2006: User Testing on a Shoestring Budget (1)


Introducing usability
• Definition: the measure of a product’s potential to accomplish the goals of a user
• How easy a user interface is to understand and use
• Ability of a system to be used [easily? Efficiently? Quickly?]
• The people who use the product can accomplish their tasks quickly and easily

Page 5: IWMW 2006: User Testing on a Shoestring Budget (1)


Assumptions
• There are several dimensions to usability:
– Focus on users
– ‘People use products to be productive’
– Users are busy people trying to accomplish tasks quickly
– Users decide when a product is easy to use
• (Adapted from Dumas & Redish, A Practical Guide to Usability Testing)

Page 6: IWMW 2006: User Testing on a Shoestring Budget (1)


However…
• Are users always busy? Does this imply that usability is only present in the workplace?!
• Effectiveness vs. efficiency vs. satisfaction
• Do users know when a product is ready?
• Do all users agree about usability?
• Is usability actually measurable?
• Is there one statistic that == ‘% usability’?

Page 7: IWMW 2006: User Testing on a Shoestring Budget (1)


Elements of usability
• Nielsen refers to five elements or components of usability:
– Learnability
– Efficiency
– Memorability
– Errors
– Satisfaction
(Usability Engineering, 1993, p. 26)
• These may not be of equal importance in all cases.

Page 8: IWMW 2006: User Testing on a Shoestring Budget (1)


In other words…
• Usability depends on context
– What does the user want to do?
– Who is the user?
– What’s the user’s perspective on life?
• Related to:
– Internationalisation; cultural and social factors
– Task analysis; working out what the user wants to do (what the goal is) and how he/she would expect to accomplish it

Page 9: IWMW 2006: User Testing on a Shoestring Budget (1)


Science vs craft
• Formal approaches:
– Research-driven
– ‘Hard science’
– Laboratory-based
• Informal approaches:
– Naturalistic, qualitative observations
– Informal setting

Page 10: IWMW 2006: User Testing on a Shoestring Budget (1)


A note about automated testing/validation

• ‘Should be’ vs ‘is’ – model vs reality

• Great handwriting does not guarantee a compellingly readable result

• Temptation to test the (computationally) obvious
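To make that last point concrete, here is a minimal sketch (not from the original talk) of the kind of check automation handles well, written in Python: it counts links and flags images lacking alt text, i.e. the computationally obvious, while saying nothing about whether the page actually makes sense to a visitor. The URL is purely illustrative.

    # A minimal, illustrative checker: it spots the computationally
    # obvious (links present, images lacking alt text) but cannot judge
    # whether the page is readable or usable -- 'is' vs 'should be'.
    import urllib.request
    from html.parser import HTMLParser

    class ObviousChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []        # hrefs found on the page
            self.missing_alt = 0   # images with missing or empty alt text

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "a" and "href" in attrs:
                self.links.append(attrs["href"])
            if tag == "img" and not attrs.get("alt"):
                self.missing_alt += 1

    def check(url):
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        parser = ObviousChecker()
        parser.feed(html)
        print(len(parser.links), "links found;",
              parser.missing_alt, "images missing alt text")

    check("http://www.ukoln.ac.uk/")

A page can pass every check such a script makes and still be unusable, which is exactly why the slides pair automated validation with testing on real users.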

Page 11: IWMW 2006: User Testing on a Shoestring Budget (1)


Scenario-based user testing
• Based around tasks
• Simple scenarios (‘hypothetical stories’/‘abstract-level test cases’):
– For a company web page, locating and using contact details
– Registration and login to a wiki
• Process: provide a task and ask the user to complete it (see the sketch below)
– It is important to test the right tasks!
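As a concrete illustration, the two example scenarios above could be written up as simple structured records before the test, so that success is easy to judge afterwards. This sketch is not from the talk; the field names and time limits are illustrative assumptions.

    # Hypothetical write-up of the two slide scenarios as test tasks.
    # Field names and time limits are illustrative, not prescribed.
    scenarios = [
        {
            "id": "contact-details",
            "task": "You need to telephone this company. Find a number "
                    "for their head office.",
            "pass_if": "User locates the contact page and reads out the number",
            "time_limit_s": 120,
        },
        {
            "id": "wiki-signup",
            "task": "Register an account on the wiki and log in.",
            "pass_if": "User ends up logged in under their new username",
            "time_limit_s": 300,
        },
    ]

    for s in scenarios:
        print("[%s] %s (pass if: %s)" % (s["id"], s["task"], s["pass_if"]))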

Page 12: IWMW 2006: User Testing on a Shoestring Budget (1)


The test process
• A facilitator with detailed knowledge of the site/software is chosen to oversee the test
– They must take care not to influence the user’s behaviour!
• The tester (user) is briefed about the site/software
• They then go through each scenario:
– ‘Think-aloud method’ – describing and explaining actions
– ‘Talk-aloud method’ – describing without explanation (considered more accurate)
• The facilitator keeps notes and prompts the user where necessary (a minimal note-logging sketch follows this list)
• Alternatively/additionally, the session can be videoed
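In the shoestring spirit, even the note-keeping can be free. The following is a hypothetical sketch, not part of the talk: a few lines of Python that give the facilitator timestamped notes which can later be lined up against a video of the session.

    # Minimal facilitator log: each observation typed during the session
    # is saved with the minutes:seconds elapsed since the test started.
    import time

    def run_session(logfile="session-notes.txt"):
        start = time.time()
        print("Type observations as they happen; a blank line ends the session.")
        with open(logfile, "a") as log:
            while True:
                note = input("> ")
                if not note:
                    break
                elapsed = int(time.time() - start)
                log.write("%02d:%02d  %s\n" % (elapsed // 60, elapsed % 60, note))

    run_session()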

Page 13: IWMW 2006: User Testing on a Shoestring Budget (1)


Creating scenarios
• A good scenario must:
– Be motivating
– Be credible
– Be complex
– Provide easy-to-evaluate results
• An Introduction to Scenario Testing, Cem Kaner, Florida Tech, June 2003
– Can be gleaned from documented requirements?