Human-Computer Interaction
Usability Evaluation 2: Expert and Empirical Methods


Page 2: Lecture Overview

• Expert evaluation
• Empirical evaluation
  • Observational
  • Survey evaluation
  • Experimental evaluation

Page 3: Expert Evaluation

Advantages:
• Strongly diagnostic
• Overview of whole interface
• Few resources needed (except £ for experts)
• Cheap
• High potential return - detects significant problems

Disadvantages:
• Restrictions in role playing
• Subject to bias
• Problems locating experts
• Cannot capture real user behaviour

Page 4: Observational Evaluation

Advantages:
• Quickly highlights difficulties
• Verbal protocols are a valuable source of information
• Can be used for rapid iterative development
• Rich qualitative data

Disadvantages:
• Observation can affect user activity and performance levels (Hawthorne effect)
• Analysis of data can be time-consuming and resource-consuming

Page 5: Observational Evaluation (Cont’d) - Cooperative Evaluation (Monk et al., 1992)

• Intended for use by non-HF experts
• User as collaborator in evaluation, not simply as subject
• User begins by thinking aloud
• Evaluator can answer and ask questions
• Evaluation session produces a protocol
• Protocol transcription and analysis can be time-consuming

Page 6: Survey Evaluation (Interviews and Questionnaires)

Advantages:
• Addresses users’ opinions and understanding of the interface
• Can be made to be diagnostic
• Can be applied to users and designers
• Questions can be tailored to the individual
• Rating scales lead to quantitative results
• Can be used on a large group of users

Disadvantages:
• User experience is important
• Low response rates (especially by post)
• Possible interviewer bias
• Possible response bias
• Analysis can be complicated and lengthy
• Interviews very time-consuming
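To illustrate the "rating scales lead to quantitative results" point above, here is a minimal sketch, not part of the lecture: hypothetical 5-point Likert responses to an invented three-item questionnaire are aggregated into per-question means and an overall 0-100 score. All item wordings and numbers are made up for illustration.

```python
# Hypothetical 5-point Likert responses (1 = strongly disagree, 5 = strongly agree);
# each inner list holds one respondent's ratings for the three invented items.
from statistics import mean

questions = [
    "The interface was easy to learn",
    "I could recover from errors quickly",
    "I would use this system regularly",
]
responses = [
    [4, 5, 3],
    [5, 4, 4],
    [3, 3, 2],
    [4, 4, 5],
]

# Mean rating per question gives a simple quantitative summary
for i, q in enumerate(questions):
    scores = [r[i] for r in responses]
    print(f"{q}: mean = {mean(scores):.2f} (n = {len(scores)})")

# An overall score: mean of all ratings rescaled to 0-100 for easier reporting
all_scores = [s for r in responses for s in r]
overall = (mean(all_scores) - 1) / 4 * 100
print(f"Overall rating: {overall:.0f}/100")
```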

Page 7: Experimental Evaluation

Advantages:
• Powerful method (dependent on the effects investigated)
• Quantitative data for statistical analysis
• Can compare different groups of users
• Reliability and validity good
• Replicable

Disadvantages:
• High resource demands
• Requires knowledge of experimental method
• Time spent on experiments can mean evaluation is difficult to integrate into the design cycle
• Tasks can be artificial and restricted
• Cannot always generalize to the full system in a typical working situation

Page 8: Guidelines for Experimental Design

• Decide measures of interest and hypotheses
• Develop a set of representative tasks
• Run a pilot study
• Determine experimental design
  • Within-subjects or between-subjects
  • Dependent and independent variables
• Select sample(s) of typical subjects (size?)
• Design the experiment to eliminate unwanted variables and effects
• Explain the procedure to subjects and run the experiment(s)
• Collect objective and subjective data
• Compute statistics and analyse (see the sketch below)
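The following is a minimal sketch, not part of the lecture, of the final two steps (collect objective data, then compute statistics and analyse). It assumes a hypothetical between-subjects comparison of two interface designs and uses an independent-samples t-test from SciPy; the design names, participant numbers, and timings are all invented.

```python
# Sketch of analysing a between-subjects experiment: does mean task-completion
# time differ between two interface designs? (Hypothetical data throughout.)
from statistics import mean, stdev
from scipy import stats

# Dependent variable: completion time (seconds) for one representative task.
# Independent variable: which design a participant used (A or B); between-subjects,
# so each participant appears in exactly one group.
design_a = [52.1, 47.8, 60.3, 55.0, 49.6, 58.2, 51.4, 62.0]
design_b = [44.7, 41.2, 50.9, 39.8, 46.3, 43.5, 48.1, 42.6]

# Descriptive statistics for each group
for name, times in (("A", design_a), ("B", design_b)):
    print(f"Design {name}: n={len(times)}, mean={mean(times):.1f}s, sd={stdev(times):.1f}s")

# Independent-samples (Welch) t-test; Welch's correction avoids assuming equal variances
t, p = stats.ttest_ind(design_a, design_b, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")  # a small p-value supports the hypothesis of a difference
```

A within-subjects design, where every participant uses both designs, would instead call for a paired test (e.g. scipy.stats.ttest_rel) and counterbalancing of task order.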

Page 9: General Points

• Evaluation is relevant throughout development
• Different methods are most suited at different stages - rule of thumb:
  • Early design (paper-based only) - analytic / expert
  • Prototype development - observational / experimental
  • Late development - survey
• A mix of objective and subjective measures is desirable

Page 10: Lecture Review

• Expert evaluation
• Empirical evaluation
  • Observational
  • Survey evaluation
  • Experimental evaluation