User Experience Research Methods in 3D: What to Use When and How to Know You're Right
(Transcript)
Christian Rohrer, PhD
Senior Director of User Experience Design, Move, Inc. | REALTOR.com
Presented at BayCHI, SF chapter of ACM SIGCHI, January 13, 2009, Palo Alto, CA
© 2009 Christian Rohrer
Topics
• The User Research Landscape
• Qualitative Validity
• User Research Classes
• Desirability
• True Intent Studies
• User Experience and Strategy
How this got started
http://www.useit.com/alertbox/user-research-methods.html
The User Research Landscape: Three Dimensions of Methods
1. Attitudinal vs. Behavioral
2. Qualitative vs. Quantitative
3. Context of website or product use
Questions answered by research methods, based on Data Source & Approach
[Figure: 2x2 matrix. Data Source axis: Attitudinal (what people say) vs. Behavioral (what people do). Approach axis: Qualitative/direct (why & how to fix) vs. Quantitative/indirect (how many & how much). © 2008 by Christian Rohrer]
The Attitudinal vs. Behavioral Dimension
• Attitudinal Research
– Understand, measure, or inform change of people's stated beliefs
– Often called "self-reported" data
– Often relied on heavily in marketing departments
– Example methods: Surveys, Focus Groups
• Behavioral Research
– Understand what people do, with minimal interference from the method itself
– Example methods: Data Mining/Analysis, Eyetracking
The Qualitative vs. Quantitative Dimension
• Qualitative Research
– Data typically gathered directly by observing the user
– Researcher can ask follow-up questions, probe on behavior, and possibly adjust the protocol as the study progresses
– Analysis of data is not mathematical
• Quantitative Research
– Data typically gathered indirectly through a research instrument such as a survey or web server logs
– Large amounts of data that can be coded and analyzed mathematically
The Attitudinal vs. Behavioral Dimension
• Mixed approaches
– In the middle of the spectrum are the two most popular methods:
• Usability lab studies
• Ethnographic field studies
– Though they often include a mix of attitudinal and behavioral data, they are generally best for understanding user behavior
Research Methods Landscape
Usability Lab Studies
• Participants are recruited for 1-on-1 sessions where they are given tasks and are asked to complete them using a prototype or the live site
• Participants think aloud as they complete the assigned tasks
• Researcher observes and notes their behavior
• Data often reveals usability issues, content issues, and users' mental models
• Often done iteratively throughout the design process
Research Methods Landscape
Ethnographic Field Studies
• Participants are observed in their natural environment (most typically in their homes, offices, or wherever they use the product)
• Provides a deep understanding of their lifestyles, cultures, processes, and work-arounds as a basis for better understanding their needs and problems
• Best if done early in the development process to help inform features and functionality
Research Methods Landscape
Focus Groups
• Participants are asked about their reactions to a product, service, concept, brand, or advertisement in a small-group setting (usually 6-10 people)
• Pros:
– Provides attitudinal data that is useful for marketing and brand issues
– Can inform big-picture product strategy
– Useful to guide the construction of a subsequent attitudinal survey
• Cons:
– Not appropriate for evaluating product usability
– Participants have difficulty imagining their interaction with the product
– Poor at predicting future behavior
– "Groupthink": participants are swayed by others
– Cultural rules and social norms shape responses
Solomon Asch: Social Pressure and Conformity
Asch, 1956
Conclusion: Focus groups are susceptible to groupthink
Research Methods Landscape
Participatory Design
• Participants are recruited to participate in small group sessions (4-8 participants) in which they complete exercises designed to help them express their cognitive, emotional, aspirational, and procedural ideas and issues
• Often, materials are provided to allow participants to diagram and design ideal product experiences
• The act of physically laying out words and images, and the choice of placement in a diagram, enables participants to articulate their ideas more thoroughly than they can in a typical interview or conversation
• Participants' designs are used as an explanatory vehicle for their needs, not as actual design specs
• Sessions require a lot of materials and preparation
The Context of Product Use Dimension
• Natural
– Goal is to minimize the interference from the study in order to understand users' natural behavior and attitudes
– Example methods: Ethnographic Field Studies, Intercept Surveys, Data Mining
• Scripted
– Goal is to focus the insights by providing consistency between participants
– Degree of scripting can vary widely
– Example methods: Benchmark Usability Studies
• De-contextualized / Product is not used
– Goal is to examine issues that are broader than usage or usability
– Example methods: Brand or Cultural Behavior Studies
• Hybrid approaches
– Goal is to creatively combine product use during the study to meet the research goals
– Example methods: Participatory Design
Research Methods Landscape
Landscape of User Research Methods
[Figure: user research methods plotted along two axes: Data Source (Attitudinal vs. Behavioral, with mixed methods in between) and Approach (Qualitative/direct vs. Quantitative/indirect). Each method is also keyed by Context of Product Use during data collection: natural use of product; scripted (often lab-based) use of product; de-contextualized / not using product; or combination/hybrid. Methods plotted include: Focus Groups, Phone Interviews, Ethnographic Field Studies, Cardsorting, Diary/Camera Studies, Intercept Surveys, Usability Lab Studies, Eyetracking, Usability Benchmarking (in lab), A/B (Live) Testing, Online User Experience Assessments ("Vividence-like" studies), Desirability Studies, Data Mining/Analysis, Email Surveys, Message Board Mining, Participatory Design, and Customer feedback via email. © 2008 Christian Rohrer]
Typical Product Development Stages and Classes of Research
Stages: Understand → Conceive → Design → Develop → Launch
Classes of research: Strategize | Optimize | Assess
• Strategize – Key question: What shall we do? Goal: Inspire new ideas; discover opportunities; inform strategy. Methods: Ethnography, Competitive analysis, Feature/task analysis, Develop/test concepts, Market segmentation, Brand strategy
• Optimize – Key question: How shall we do it? Goal: Understand user goals and tasks; improve/refine the design; reduce execution risk. Methods: Usability inspections, Participatory design, Iterative design & testing (RITE, paper prototypes), Desirability studies, Usability (lab) studies, Ethnographic field studies
• Assess – Key question: How well did we do? Goal: Measure or compare against self or competition; feed back into strategy. Methods: "Tracker" surveys (CSAT, Health, Loyalty, NPS), Online User Experience Assessments, Usability benchmarks, Data mining analysis, Live (A/B) testing, Qualitative insights
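Since live (A/B) testing appears in the Assess class, here is a minimal sketch of how an A/B result might be judged for significance; the function name and traffic numbers are hypothetical, not from the deck:

```python
import math

def two_proportion_z(successes_a: int, n_a: int, successes_b: int, n_b: int) -> float:
    """Z statistic for H0: the two variants' conversion rates are equal."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical 50/50 traffic split: variant A converts 120/2000, variant B 150/2000.
z = two_proportion_z(120, 2000, 150, 2000)
print(f"z = {z:.2f}")  # |z| > 1.96 would be significant at the 5% level
```

In practice a team would also pre-register the sample size rather than peeking at the statistic as data accumulates.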
Qualitative Validity
Key points – Medlock et al., 2002
• Pointed the debate about "how many users are enough" in a very pragmatic direction
– Focus should be on finding and fixing problems
– Impact ratio = # problems found : # problems fixed
• Alternated design and research each day:
MON: Usability study on v1 of design
TUE: Review of problems and re-design
WED: Usability study on v2 of design
THU: Review of problems and re-design
FRI: Usability study on v3 of design
Medlock et al., 2002 (cont.)
• Classified issues into 4 categories:
1. Obvious cause + obvious solution + quick implementation
2. Obvious cause + obvious solution + slow implementation
3. No obvious cause, therefore no obvious solution
4. Due to other factors (test script, moderating problem)
• Validity = "the observation is evaluated against the knowledge and critical thinking ability of the observers"
• Solved category 1 problems between participants
– Impact ratio in case study: 97% (30/31)
• Most usability studies are formative, not summative
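The 97% impact ratio from the case study (30 of 31 found problems fixed) is simple arithmetic; a minimal sketch, with a function name of my own:

```python
def impact_ratio(problems_fixed: int, problems_found: int) -> float:
    """Share of usability problems found that were actually fixed."""
    if problems_found == 0:
        raise ValueError("no problems found")
    return problems_fixed / problems_found

# Case-study numbers from the slide: 30 of 31 problems fixed.
ratio = impact_ratio(30, 31)
print(f"{ratio:.0%}")  # → 97%
```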
Qualitative Validity
Full paper available under "Publications" on my site: http://www.xdstrategy.com/pubs/
Mike Katz, PhD
Key Takeaways
• The debate about "how many users are enough" has confused practitioners
– Formative and summative studies are not distinguished well
– "How many" is only relevant for summative studies
• Practitioners erroneously use mathematics to establish validity in a qualitative method
– "Look for patterns & trends… throw away those with only one or two observations"
• What if 2 out of 6 have a problem? Is it valid?
Criteria for a Valid Usability Issue*
1. The participant is representative of the target users
2. The difficulty stemmed from a behavior that was reasonable, given the product domain
3. The problem can be clearly described
4. The impact can be clearly described
5. A rational account of the cause of the problem can be provided
* For formative usability studies, which comprise most usability research
Example: Yahoo! Personals*
* Courtesy of Jeralyn Reese of User Inspired (www.userinspired.com)
A Valid Issue?
1. Was the participant representative?
2. Was the behavior leading to difficulty reasonable?
3. Was the problem clearly described?
4. Was the impact clearly described?
5. Could a rational account of the cause of the problem be provided?
A recent example on eBay
1. eBay Search: "Webkinz seahorse"
2. [Search Results Page]: <Click lowest auction>
3. [Listing page]: <Enter bid>
4. [Confirm bid]: <Click Confirm Bid>
5. [Bid Confirmation]: "You've just been outbid!"
– Main options: Increase Bid; Back to item description; My eBay; Ask Seller a Question; View seller's other items
6. <Back> <Back> <Back> to Search Results Page
7. Repeat from step 2
Desirability: Definitions and Research
Desirability as a contrast with Usability and Utility/Usefulness
• Usability – Can the product be used?
• Utility/Usefulness – Does it meet a need?
• Desirability – [multiple meanings]
– Is it enjoyable?
– Do I want to use it?
– Is it aesthetically pleasing?
– Does it evoke a positive emotional response?
– Does it move toward our target brand attributes?
A Simple Model of User Experience
[Figure: concentric layers of user experience]
• Utility – "It is useful to me. It meets my needs." A good UX, at its core, must be useful…
• Usability – Functionally, people must be able to use it…
• Desirability – The way it looks must be pleasing…
• Brand experience – These support the overall brand experience; Marketing (campaigns, WOM, viral) sits at the boundary of User Experience(?)
Research can and should be conducted at all of these layers.
Desirability Studies
• Based on Microsoft Product Reaction Cards
• At the end of a usability study or other session where the participant interacted with the design, participants are presented with 118 product reaction words on cards
• Provides participants with a vocabulary to better describe their experience with the product
• Quantitative adaptation
– A large sample of respondents are sent an online survey where they review screenshots of the design and choose from the list of 118 product reaction terms
Sample reaction terms: Accessible, Appealing, Attractive, Busy, Collaborative, Complex, Comprehensive, Confusing, Connected, Consistent, Customizable, Desirable, Easy to use, Efficient, Empowering, Exciting, Familiar, Fast, Flexible, Fresh, Frustrating, Fun, Gets in the way, Hard to use, High quality, Inconsistent, Intimidating, Inviting, Motivating, Not valuable, Organized, Overbearing, Overwhelming, Patronizing, Personal, Predictable, Relevant, Reliable, Rigid, Simplistic, Slow, Sophisticated, Stimulating, Straight Forward, Stressful, Time-consuming, Time-saving, Too technical, Trustworthy, Uncontrollable, Unconventional, Unpredictable, Usable, Useful, Valuable
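The quantitative adaptation boils down to tallying how often each reaction term is chosen per design. A minimal sketch with hypothetical responses (the design names and selections are illustrative, not from the study):

```python
from collections import Counter

# Hypothetical survey responses: each respondent saw one design (between subjects)
# and picked the reaction terms that best described it.
responses = {
    "design_a": [["Fun", "Fresh", "Busy"], ["Fun", "Attractive"], ["Confusing", "Busy"]],
    "design_b": [["Trustworthy", "Usable"], ["Usable", "Simplistic"], ["Usable", "Fun"]],
}

def term_frequencies(selections):
    """Fraction of respondents who chose each reaction term."""
    counts = Counter(term for chosen in selections for term in chosen)
    n = len(selections)
    return {term: count / n for term, count in counts.items()}

for design, selections in responses.items():
    freqs = term_frequencies(selections)
    top = sorted(freqs.items(), key=lambda kv: kv[1], reverse=True)[:3]
    print(design, [(term, f"{share:.0%}") for term, share in top])
```

The same tally applied to the "which terms would you want" group gives the target profile to compare each design against.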
Case Study: Yahoo! Personals Desirability Study*
Fun | Modern | Simple
* Courtesy of Jeralyn Reese and Michelle Reamy
Case Study: Yahoo! Personals Desirability Study
• 3 proposed visual design directions
• Quantitative desirability study
– Participants recruited for an online survey through email
– Presented with a proposed design direction (between subjects) and asked to associate product reaction terms with the design
– Another group was asked which product reaction terms they would want in an online personals website
Case Study: Yahoo! Personals Desirability Study
Update: For more details, see Christian's blog entry on Desirability Studies at: http://www.xdstrategy.com/blog/
Visualizing results: Attribute reporting
Visualizing data: Paired opposites
Is this really "Desirability"?
Desirability Definitions
Working definitions of "Desirability" and their approximate share of responses:
• Emotional want: ~28%
• Useful: ~10%
• Cultural/Community: ~10%
• Goes beyond expectations: ~8%
• Aesthetic: ~7%
• In demand: ~7%
• Usable: ~7%
• Seductive: ~5%
Representative quotes:
• "The extent to which users like, prefer or want to use a system."
• "The 'wow' factor: the product has cutting-edge features or functions and users feel a connection to it."
• "When it fills a need and users want to use the product, it evokes a positive emotional response ('enjoyable to use')."
• "Products that appeal emotionally through design, functionality and price simultaneously."
• "Going beyond the basic needs."
• "When the interface is invisible."
• "The UI is simple and easy to use."
• "What I want and can just barely obtain."
• "The more demand, the more desire."
• "Makes you feel like a part of something bigger."
• "Aesthetics – gets your attention and makes you look/feel good."
• "Usefulness (meeting needs) can create desirability."
• "Sexy."
Desirability Definitions (cont.)
• "The outcome of pursuing what I call the 'integrative aesthetic experience' – an authentic balance of sensual style, mythical storytelling, practical utility, and technical performance."
• "Too much overlap with usability and usefulness. I prefer either (1) Enjoyable or (2) Emotionally engaging."
• "It comes down to the attributes you want to measure; otherwise it will just generate noise."
True Intent Studies
• A type of "online user experience assessment" for websites
• Measures users' initial intent and brand affinity, shows behavior/site usage, and measures "success"
• First case study published at UPA 2003 by Jeff Schueler and Scott Kincaid of Usability Sciences Corporation (USC)
• Now offered by online user experience consulting firms, including USC and Keynote
Landscape of User Research Methods
[Figure: the Landscape of User Research Methods chart, repeated, now highlighting "True Intent" or "Natural User Intent & Success" Studies]
How a True Intent study works:
1. Upon site entry, a random sample of users is asked for:
• Intent
• Demographics
• Brand affinity
2. Clickstream data is collected in between
3. Upon site exit, users are asked for:
• Success (perceived)
• Brand affinity
• Next steps
• Suggestions
Data set: entry-exit paired surveys + clickstream data
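The entry-exit pairing can be sketched as a join on a session id. The data model below is an assumption for illustration (field names and values are hypothetical, not from the deck):

```python
entry_surveys = {  # session_id -> entry-survey answers
    "s1": {"intent": "Gather product information", "brand_affinity": 4},
    "s2": {"intent": "Purchase a product or service", "brand_affinity": 5},
}
exit_surveys = {  # session_id -> exit-survey answers
    "s1": {"success": True, "brand_affinity": 4},
    "s2": {"success": False, "brand_affinity": 3},
}
clickstream = {  # session_id -> pages visited, in order
    "s1": ["/home", "/product/123", "/specs/123"],
    "s2": ["/home", "/search", "/product/456", "/cart"],
}

def paired_sessions():
    """Join entry survey, exit survey, and clickstream on session id.

    Only sessions with both surveys completed are analyzable, which is why
    drop-off during recruitment matters so much.
    """
    for sid in entry_surveys.keys() & exit_surveys.keys():
        yield {
            "session": sid,
            "intent": entry_surveys[sid]["intent"],
            "pages": clickstream.get(sid, []),
            "success": exit_surveys[sid]["success"],
        }

for record in paired_sessions():
    print(record["session"], record["intent"], "->",
          "success" if record["success"] else "failure")
```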
Visit intent varies across site types: example from an e-commerce site (n = 6,750 sessions)
• Gather product information: 31.9%
• Pricing & comparison shopping: 20.3%
• Purchase a product or service: 10.1%
• Other: 9.7%
• Customer self-service or support: 8.9%
• Use the site as an information resource: 8.3%
• Browse the site: 4.8%
• Get coupons, giveaways, or enter sweepstakes: 4.5%
• Contact/communicate with the company: 1.5%
E-Commerce: Visit Success Varies by Intent
[Figure: perceived success vs. failure rates broken out by visit intent; value pairs shown include 52%/31%, 50%/28%, 33%/55%, and 46%/42%]
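Segmenting perceived success by stated intent is a simple group-by over the paired records; the records below are hypothetical, for illustration only:

```python
from collections import defaultdict

# Hypothetical paired-session records (stated intent + perceived success).
sessions = [
    {"intent": "Gather product information", "success": True},
    {"intent": "Gather product information", "success": False},
    {"intent": "Purchase a product or service", "success": True},
    {"intent": "Purchase a product or service", "success": True},
    {"intent": "Purchase a product or service", "success": False},
]

def success_by_intent(records):
    """Perceived success rate per visit intent."""
    tally = defaultdict(lambda: [0, 0])  # intent -> [successes, total]
    for r in records:
        tally[r["intent"]][0] += int(r["success"])
        tally[r["intent"]][1] += 1
    return {intent: s / n for intent, (s, n) in tally.items()}

for intent, rate in success_by_intent(sessions).items():
    print(f"{intent}: {rate:.0%}")
```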
True Intent Studies in Practice
• Can be tricky to do well
– Sampling should be random
– Recruitment is difficult because downloads (to capture data) greatly affect drop-off rates
• Larger websites have bigger barriers
– Lots of bureaucracy
– Security and code-efficiency concerns
– High traffic can bring down the consultant's site
– Need an executive champion or it won't happen
• Great when asking, "Why do people come to the site?"
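Random sampling can be as simple as inviting each visitor to the entry survey with a fixed probability; this sketch is an assumed implementation, not from the deck:

```python
import random

def should_intercept(sampling_rate: float, rng: random.Random) -> bool:
    """Invite a randomly chosen fraction of site entries, avoiding self-selection bias."""
    return rng.random() < sampling_rate

rng = random.Random(42)  # seeded only so this sketch is reproducible
invited = sum(should_intercept(0.05, rng) for _ in range(10_000))
print(invited)  # roughly 5% of 10,000 entries
```

In production the rate would be tuned so the survey volume stays manageable even on high-traffic sites, one of the practical barriers noted above.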
User Research and Strategy
Making User Experience Strategic
[Figure: UX activities mapped across Strategize → Optimize → Assess (leading to version 2.0), moving from many ideas to "the idea":
• Research: field studies, focus groups, participatory design, diary studies, surveys, usability and desirability studies, A/B testing
• Design: business visualization, personas, sketches, wireframes, visual design, interactive prototypes, UI analysis, color & imagery exploration, design patterns and visual architecture, content strategy, sample copy, editorial guidelines]
The Ecosystem of Insight Generation
• Researchers – data creation: conduct or direct research
• Analysts – data manipulation: analyze data sets
• Strategists – data review: facilitate consumption of insights
How does each of these influence decision-making?
• Design & Creativity (designers, engineers): design and build systems, informed by insights; before/after analysis
• Research & Analysis (anthropologists, psychologists): understand users & needs; usability & desirability; before/after design
Foundations of User-Centered Design (UCD):
• Combining the best of left-brain and right-brain thinking
• Injection of user insights into the design process
• Iterative cycles of analysis and creation
UER | Why is Research Part of Design?
Research for Design
• Inspiring design innovation (ethnography)
• Informing design direction
• Assessing design solutions
Design for Research
• Articulating early concepts to evaluate
• Designing conditions for experiments
[Cycle: Understand → Inspire → Design → Evaluate]
Design Thinking: Analysis and Creativity
Opinion: UX professionals can and should cross the line.
Design Thinking
• A.G. Lafley (CEO of Procter & Gamble) on types of thinking:
– Inductive: based on directly observable facts
– Deductive: logic and analysis, typically based on past facts
– Abductive: imagining what could be possible
• A method of reasoning in which one chooses the hypothesis that would, if true, best explain the relevant evidence
• Abductive reasoning starts from a set of accepted facts and infers their most likely, or best, explanations
References
• Benedek, J., & Miner, T. (2002). Measuring Desirability: New methods for evaluating desirability in a usability lab setting. In Proceedings of Usability Professionals Association 2002, Orlando, July 8-12. http://www.microsoft.com/usability/UEPostings/ProductReactionCards.doc
• Benedek, J., & Miner, T. (2002). Product Reaction Cards. Appendix to the paper "Measuring Desirability," in Proceedings of Usability Professionals Association 2002, Orlando, July 8-12.
• Katz, M. A., & Rohrer, C. (2005). How Many Users Are Really Enough? What to Report: Deciding Whether an Issue is Valid. User Experience, Vol. 4, Issue 4 (Fall 2005).
• Katz, M. A., & Rohrer, C. (2004). How Many Users Are Really Enough… And More Importantly When? Unpublished paper, available at http://www.xdstrategy.com/pubs
• Lafley, A. G. (2008). The Game-Changer: How You Can Drive Revenue and Profit Growth with Innovation. Crown Business.
• Medlock, M. C., Wixon, D., Terrano, M., Romero, R. L., & Fulton, B. (2002). Using the RITE method to improve products: A definition and a case study. Usability Professionals Association, Orlando, FL, July 2002.
• Virzi, R. A. (1992). Refining the test phase of usability evaluation: How many subjects is enough? Human Factors, 34(4), 457-468.
• Schueler, J., & Kincaid, S. (2003). Gathering ROI and Visitor Success Rate Directly from Site Visitors. Usability Professionals Association, Scottsdale, AZ, June 2003.
• Williams, D., Kelly, G., & Anderson, L. (2004). MSN 9: New user-centered desirability methods produce compelling visual design. Proceedings of ACM SIGCHI 2004, Vienna, April 2004.
Thank You and Questions
Thanks to the following UX professionals who influenced & contributed to the ideas presented:
Aaron Marcus | Beth Leber | Carol Farnsworth | Deborah Adams-Estrada | Elissa Darnell | Erik Ojakaar | Frank Guo | Goli Jendreski | Harley Manning | Jacklyn Schrier | Jake Cressman | Jakob Nielsen | Jeff Herman | Jenn Anderson | Jennifer Moore | Jeralyn Reese | John Sanborn | Klaus Kaasgaard | Lori Hobson | Marina Naito | Mike Katz | Mike Lee | Mohan Dasannacharya | Monal Chokshi | Paul Fu | Rian Van der Merwe | Sarah Culberson | Tracy Hayes | Uday Gajendar