User Research on a Shoestring


Transcript of User Research on a Shoestring

Page 1: User Research on a Shoestring

User Research on a Shoestring
Susan Teague-Rector and Erin White
VCU Libraries, Richmond, VA

Page 2: User Research on a Shoestring

What we'll cover
• Improving user experience of your (web) product through research before and during your design and development process.
• Not spending a lot of money.
• Taking the scary out of usability.

Page 3: User Research on a Shoestring

What we'll cover

Before you begin: Evaluating use of what you have
• Heatmapping
• Analytics

As you work: Assessment of user experience
• Virtual user testing
• Guerrilla testing
• Microfeedback

Page 4: User Research on a Shoestring

Some context

Sound familiar? Large university (32,000 students), small Libraries web team (3 folks), and many projects (~1,000,000).

Your situation in a nutshell:
• There's never enough time;
• spending the little money you have is a hassle, but
• you still want to make data-driven design decisions.

Optional addition:
• Your boss wants you to justify user research.

 

Page 5: User Research on a Shoestring

Justifying a little research

If your supervisors still need coaxing (or if you do), here's why any teeny little bit of user research is going to make your life easier:
• Research can significantly decrease development and maintenance time, which means cost savings.
• Google is right. Focus on the user and all else will follow.
• Happy users = brand buy-in. When users like the library more, everybody's happy.

Page 6: User Research on a Shoestring

Let's get in it

We're going to use a case study of the last VCU Libraries web redesign to demonstrate some of the tools we used to do user research.

Page 7: User Research on a Shoestring

The year: 2009. The goal: redesign.

Page 8: User Research on a Shoestring

The plan

Original homepage:
• Heatmapping
• Analytics
combined with
• Existing feedback
• Library literature

Prototypes:
• Virtual usability testing
• Guerrilla tests

New homepage:
• Heatmapping
• Microfeedback

Page 9: User Research on a Shoestring

Heatmapping

• What are you measuring?
o Where are people clicking? What is getting attention?
• What tool to use?
o We used CrazyEgg and tracked clicks on the homepage for a week in the middle of spring semester.
• What does it track?
o Clicks based on network and browser properties and return visits, among others.
• How does it work?
o Insert the JavaScript into the page you want to measure, then let CrazyEgg do the work.
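
The slides don't reproduce the embed step itself, so here is a rough sketch of it, assuming a PHP-templated site with a shared footer include. The file name and script URL below are placeholders, not CrazyEgg's actual snippet; use the one the service generates for your account.

<?php
// footer.php - hypothetical shared footer include for the site template.
// Paste the snippet your heatmapping service generates just before </body>
// on the page you want to measure. The src below is a placeholder, NOT
// CrazyEgg's real embed code.
$heatmapThisPage = true; // flip off once the one-week study window ends
?>
<?php if ($heatmapThisPage): ?>
  <script type="text/javascript" src="//tracking.example.com/snippets/12345.js"></script>
<?php endif; ?>
</body>
</html>

Removing the snippet (or flipping the flag) when the study week ends keeps you from collecting click data you don't plan to use.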

Page 10: User Research on a Shoestring

Heatmapping results

Page 11: User Research on a Shoestring

Heatmapping: pros and cons

Pros
• Robust statistics with minimal effort, $$$
• Fast way to evaluate components of a page
• Reports are easily PDF-able and aren't overly technical, so you can easily have a document to send up to managers.
• Invisible to users.

Cons
• Clicks don't tell the whole story. We know where people click, but we don't know if folks are finding what they want, or if the page makes any sense in the first place.

Page 12: User Research on a Shoestring

We've done post-launch heatmapping, too.

Page 13: User Research on a Shoestring

Analytics

• What are you measuring?
o What pages get the most traffic?
o What search terms are people using to get to your site?
o What are people searching for on your site?
• What tool to use?
o We used Google Urchin and an in-house search query tracker. (Removing personally identifiable data like IP.)
o Google Analytics does this as well.
• How does it work?
o Urchin analyzes server logs of page visits, user characteristics, and referring sites.
o In-house tool records search terms into a database and displays a word cloud of frequent terms.
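
The slides don't include the in-house tracker's code; the sketch below is a minimal stand-in for the idea, with hypothetical table and column names (search_terms, term, searched_at): log each query with no IP address or other personally identifiable data, then count frequencies to feed the word cloud.

<?php
// log_search.php - minimal sketch of an in-house search-term tracker.
// Table and column names are hypothetical, and no IP addresses or other
// personally identifiable data are stored.

$pdo = new PDO('mysql:host=localhost;dbname=weblogs', 'webuser', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Record the term the user just searched for.
function log_search_term(PDO $pdo, $term) {
    $stmt = $pdo->prepare(
        'INSERT INTO search_terms (term, searched_at) VALUES (:term, NOW())'
    );
    $stmt->execute([':term' => strtolower(trim($term))]);
}

// Pull frequency counts to feed the word cloud of common terms.
function top_search_terms(PDO $pdo, $limit = 50) {
    $limit = max(1, (int) $limit);
    $rows  = $pdo->query(
        "SELECT term, COUNT(*) AS freq
           FROM search_terms
          GROUP BY term
          ORDER BY freq DESC
          LIMIT $limit"
    );
    return $rows->fetchAll(PDO::FETCH_KEY_PAIR); // e.g. ['renew books' => 213, ...]
}

// Called from the search page after a query is submitted.
if (!empty($_GET['q'])) {
    log_search_term($pdo, $_GET['q']);
}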

Page 14: User Research on a Shoestring

Analytics: Urchin

Page 15: User Research on a Shoestring

Analytics: Search terms

Page 16: User Research on a Shoestring

Analytics: pros 'n' cons

Pros
• Search terms tell us what pages are hard to find, and ways users think about our site.
• Page visits tell us what's popular, and where people are coming from to get there.
• Analytics tracking is invisible to users.

Cons
• These tools can't track user navigation paths through our site.
• Again, clicks don't tell the whole story. Usage data only tells us so much.

Page 17: User Research on a Shoestring

And now... usability research.

(I found this on the internet years ago, and if you can tell me what this is from or who the artist is, please e-mail me. It's driving me nuts. - Ed.)

Page 18: User Research on a Shoestring

User experience: what we're here for (or, what is UX?)

Page 19: User Research on a Shoestring

Morville, P. The User Experience Honeycomb

What smart people say about UX

Peter Morville gets us thinking about UX as a honeycomb of elements that together can form a positive user experience.

We list several UX resources at the end of the presentation, too.

Page 20: User Research on a Shoestring

Usability and UX: What's the diff?

• Usability is a component of User Experience, the part of this equation that you have control of.

Page 21: User Research on a Shoestring

Demystifying usability

Usability research just asks, "Does this thing make sense to other humans?"

There is no perfect tool. Usability engineering is a process, not an end result.

Usability research can be easy, fun, fast, and inexpensive. And anyone can do it.

Page 22: User Research on a Shoestring

Krug on how we should think about usability studies:

some > none
done > perfect
sooner > later

This book rules (and is a fast read). OCLC # 61895021

Page 23: User Research on a Shoestring

Goals of usability testing

Jakob Nielsen: Usability 101
• Learnability: How easy is it for users to do basic tasks the first time they see the site?
• Efficiency: Once users have learned the design, how quickly can they perform tasks [in that session]?
• Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?
• Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors?
• Satisfaction: How pleasant is it to use the design?

Page 24: User Research on a Shoestring

So let's get crackin'

We tried to evaluate user experience through virtual usability testing with Usabilla, in-person guerrilla usability testing, and microfeedback forms.

All these methods are components of a UX toolkit. Individually, they are helpful, but together, they give a more complete picture.  There is no "right" answer, but with a combination of tools you can learn more than with one tool alone.

Page 25: User Research on a Shoestring

Virtual usability testing

• What are you measuring? The intangibles of user experience.
o What doesn't make sense?
o Where do you think you should click to perform X task?
o What works?
• What tool to use?
o We used Usabilla. 5secondtest and a host of others provide similar services.
• How does it work?
o Upload a screenshot of a web interface and create questions to ask about the interface. Questions can be task-based or feelings-based.
o Users visit the testing site on their own time and give feedback in the form of clicks and optional notes.

Page 26: User Research on a Shoestring

What do you like about this page?

Page 27: User Research on a Shoestring

What do you like about this page? (comments from users)

Notes
• nice white space around outside
• nice centralized location for web 2.0
• boxes arranged in tight enough formation that they aren't just "a page of text"
• dynamic (?) changing image centralized

Page 28: User Research on a Shoestring

What do you dislike about this page?

Page 29: User Research on a Shoestring

What do you dislike about this page? (comments from users)

Notes
• hours at top, directions bottom left, events bottom right, services in the middle ... i'm lost
• 'contact us' twice within inches of each other?
• 'suggestions and feedback' (what's the difference?) down here, 'ask us' up top
• what could possibly be in 'about us?' it all seems to be on the homepage ...
• location of FOL thing near web 2.0 stuff is bad -- confuses def'n of "friend"
• will this be consistent branding throughout?
• not sure if people will understand library services a-z w/o prompting

Page 30: User Research on a Shoestring

What do you dislike about this page?

(Same user comments as on the previous page.)

Changes we incorporated are highlighted.

Page 31: User Research on a Shoestring

Virtual usability testing pros and cons

Pros
• Fast and easy for all involved.
• Data on intangibles that we couldn't get through web stats.
• Free-text suggestions as a catch-all for unforeseen issues.
• Bonus: users could position free-text comments contextually on the page, so we could easily figure out what the hell they were talking about when they said, "this thing here doesn't make any sense."

Cons
• It can be hard to find users who are invested enough to (a) participate, and (b) give free-text responses.
• Keep context in mind if you have a small, relatively homogeneous sample (i.e. Library faculty).

Page 32: User Research on a Shoestring

Guerrilla testing

• What are you measuring? The intangibles of user experience: feelings, task completion, other thoughts about features and functionality.
• What tool do you use? Paper prototypes, basic web prototypes, or anything in between; and a little extraversion.
• How does it work?
o Take advantage of the fact that your web users are the same people who use your physical spaces.
o Develop a short script, walk out into public spaces with prototypes, seek out participants, and ask people questions.
o Shut up and listen.

Page 33: User Research on a Shoestring

Example method

• Before you begin: develop a short script for use at the beginning and end of each interview, as well as a few questions for each user.
• Go out into the library and approach people who look like they could spare a few minutes to talk. Try to target folks across populations - undergrads, grads, faculty.
• Shoot for a 5-minute interview. Keep it short, and try not to overwhelm.
• Don't worry about getting it perfect every time.
• Write down everything you can - verbal and nonverbal responses, gestures, facial expressions, etc.
• Respect the user's time.
• Listen.
• Say thank you!

Page 34: User Research on a Shoestring

Example intro script

Hello, my name is _______ and I'm with VCU Libraries. We've been working on a redesign of the Libraries Web site and would love to get your feedback on the design. Would you be willing to participate in a short 5-minute survey? We will not be collecting any personal information except your affiliation with the university.

This is also not a test of your abilities, but just a tool to help us understand how the site will work for our customers.

Do you have any questions before we begin?

Page 35: User Research on a Shoestring

Example questions to ask:
• What is your affiliation with the university?
• What single feature do you like most about the web page?
• What single feature do you like least about the web page?
• How many Top Resources does the Libraries recommend?
• In the redesign, how would you
o Find a book?
o Find articles for your specific topic?
o Place a hold or renew a book?
o Order/find items that VCU Libraries does not own?
o Contact a librarian?
• What single feature would you like to have available that you do not see included?

Page 36: User Research on a Shoestring

Guerrilla testing pros and cons

Pros
• Most web developers have to seek out users; library users are right here in our building.
• It's an invaluable opportunity to talk to real users and learn about problems you might not've even considered.
• You will find usability issues FAST. Be prepared to be flexible about addressing them.
• You can repeat guerrilla tests as you iterate on designs.

Cons
• Users may not be invested enough to give in-depth answers. Give small compensation if you can.
• Be prepared to return to the drawing board (a good problem to have!)
• Be mindful of your biases when selecting participants.

Page 37: User Research on a Shoestring

Microfeedback

• What are you measuring? The intangibles of user experience during/after a new web product launch.
• What tool do you use? Small, 3-question quick feedback forms built in PHP. Some folks also use GetSatisfaction, UserVoice or Google Docs.
• How does it work?
o Create a prominent "Give Feedback" link to the survey, or just embed the survey on your page. Keep it small, fast and easy. Make targets big. Don't require fields. Let users decide what to tell you.
o Users submit quibbles/questions/comments.
o Look at the comments, look for trends, and address issues.
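
Our form was a small custom PHP script. The sketch below is not that script, just a minimal example of the same shape under assumed names (feedback.php, a feedback table): a few radio-button ratings plus one optional free-text box, and nothing required.

<?php
// feedback.php - minimal sketch of a microfeedback form (hypothetical names).
// Nothing is required; users decide how much to tell us.
$aspects = ['ease_of_use' => 'Ease of use', 'content' => 'Content',
            'layout' => 'Layout', 'overall' => 'Overall'];
$scale   = ['Love it', 'Like it', 'Indifferent', 'Confused', 'Hate it'];

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $pdo  = new PDO('mysql:host=localhost;dbname=weblogs', 'webuser', 'secret');
    $stmt = $pdo->prepare(
        'INSERT INTO feedback (ease_of_use, content, layout, overall, comment, submitted_at)
         VALUES (:ease_of_use, :content, :layout, :overall, :comment, NOW())'
    );
    $stmt->execute([
        ':ease_of_use' => isset($_POST['ease_of_use']) ? $_POST['ease_of_use'] : null,
        ':content'     => isset($_POST['content'])     ? $_POST['content']     : null,
        ':layout'      => isset($_POST['layout'])      ? $_POST['layout']      : null,
        ':overall'     => isset($_POST['overall'])     ? $_POST['overall']     : null,
        ':comment'     => isset($_POST['comment'])     ? trim($_POST['comment']) : null,
    ]);
    echo '<p>Thanks for the feedback!</p>';
}
?>
<form method="post" action="feedback.php">
  <?php foreach ($aspects as $name => $label): ?>
    <p><?php echo $label; ?>:
      <?php foreach ($scale as $choice): ?>
        <label>
          <input type="radio" name="<?php echo $name; ?>" value="<?php echo $choice; ?>">
          <?php echo $choice; ?>
        </label>
      <?php endforeach; ?>
    </p>
  <?php endforeach; ?>
  <p><label>Any other comments?<br>
     <textarea name="comment" rows="3" cols="40"></textarea></label></p>
  <p><input type="submit" value="Send feedback"></p>
</form>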

Page 38: User Research on a Shoestring

  

The questions we asked:

              Love it   Like it   Indifferent   Confused   Hate it
Ease of use      °         °          °            °          °
Content          °         °          °            °          °
Layout           °         °          °            °          °
Overall          °         °          °            °          °

Any other comments?

Page 39: User Research on a Shoestring

  

Three months, 268 responses:

              Love it   Like it   Indifferent   Confused   Hate it
Ease of use      93        65         20           21         38
Content          85        75         29           21         25
Layout           92        75         21           19         28
Overall          90        75         18           22         27

...and a host of free-text responses (n~=100).

Page 40: User Research on a Shoestring

Microfeedback pros and cons

Pros
• Quickly pick up on unforeseen problems, broken links, and usability issues.
• Users can give as much or as little info as they want, fast.
• Some users use microfeedback already and may be more willing to use it (think Facebook's "like" feature).

Cons
• Some users thought this was a library help form. Be clear about the form's scope when you post it.
• The sample mostly includes lovers and haters, and not many in between. Don't take mean responses (and there will be some) personally.

Page 41: User Research on a Shoestring

Remember this slide? When doing user research, remember:

some > none
done > perfect
sooner > later

OCLC # 61895021

Page 42: User Research on a Shoestring

Thanks! and resources
• User Experience Network
• UXMatters
• Garrett, J. J. The Elements of User Experience
• Krug, S. Don't Make Me Think
• Morville, P. Ambient Findability
• Nielsen, J. Designing Web Usability and UseIt.com
• Designing Better Libraries
• Adaptive Path
• Quick and Dirty Remote User Testing

Page 43: User Research on a Shoestring

The authors

Susan Teague-Rector
Web Design Project Librarian
NCSU Libraries

Erin White
Web Applications Developer
VCU Libraries