Chapter 11: An Evaluation Framework
Group 4: Tony Masi, Sam Esswein, Brian Rood, & Chris Troisi
Introduction
Evaluation helps ensure that the product meets the users’ needs.
Recall HutchWorld & the Olympic Messaging System (OMS) from Chapter 10.
What to evaluate? Usability and the user experience.
Chapter Goals
Key concepts & terms for discussing evaluation
Description of evaluation paradigms & techniques
Conceptual, practical, and ethical issues
DECIDE framework for evaluation
Evaluation Paradigms
Key terms: evaluation paradigm, user studies
Four core evaluation paradigms:
- “Quick and dirty” evaluation
- Usability testing
- Field studies
- Predictive evaluation
Key Terms
User studies involve looking at how people behave in their natural environments, or in the laboratory, both with old technologies and with new ones.
An evaluation paradigm is the set of beliefs that guides any type of evaluation.
“Quick and Dirty” Evaluation
Quick & dirty evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users’ needs and are liked.
Quick & dirty evaluations can be done at any time.
The emphasis is on fast input to the design process rather than carefully documented findings.
Usability Testing
Usability testing involves recording typical users’ performance on typical tasks in controlled settings. As the users perform these tasks they are watched & recorded on video & their key presses are logged.
This data is used to calculate performance times, identify errors & help explain why the users did what they did.
User satisfaction questionnaires & interviews are used to elicit users’ opinions. Recall HutchWorld and OMS.
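The logged key presses can be turned into the performance measures the slide mentions. A minimal sketch of that analysis (the event format and the "error" marker are illustrative assumptions, not from the chapter):

```python
# Sketch: derive performance metrics from a logged event stream.
# Each event is (timestamp_in_seconds, event_name); "error" marks a slip
# the evaluator flagged. The format is invented for illustration.

def summarize_session(events):
    """Return task time and error count for one user's logged session."""
    times = [t for t, _ in events]
    task_time = max(times) - min(times)  # seconds from first to last event
    errors = sum(1 for _, name in events if name == "error")
    return {"task_time_s": task_time, "errors": errors}

log = [
    (0.0, "task_start"),
    (2.4, "keypress"),
    (5.1, "error"),          # e.g., wrong menu item selected
    (9.8, "task_complete"),
]
print(summarize_session(log))  # {'task_time_s': 9.8, 'errors': 1}
```

Per-user summaries like this are what get compared across participants, alongside the video and questionnaire data.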
Field Studies
Field studies are done in natural settings. The aim is to understand what users do naturally and how technology impacts them.
In product design, field studies can be used to:
- identify opportunities for new technology
- determine design requirements
- decide how best to introduce new technology
- evaluate technology in use
Field Studies: Two Approaches
Outsider – observing and recording what happens as an outsider looking in
Insider – a participant in the study who explores the details of what happens in a particular setting
Predictive Evaluation
Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems. (Note: heuristics are design principles used in practice.)
Another approach involves theoretically based models.
A key feature of predictive evaluation is that users need not be present.
Relatively quick & inexpensive.
Key Aspects of Each Evaluation Paradigm (Table 11.1 – page 344)
- role of users
- who controls the process & relationship during evaluation
- location
- when is it most useful to evaluate
- type of data collected & how it is analyzed
- how findings are fed back to the design process
- philosophy that underlies these paradigms
Evaluation Techniques
- Observing users
- Asking users their opinions
- Asking experts their opinions
- Testing users’ performance
- Modeling users’ task performance to predict the efficacy of a user interface
Observing Users
Techniques: notes, audio, video, interaction logs
Asking Users Their Opinions
Questions like:
- What do you think about the product?
- Does it do what you want?
- Do you like it?
- Does the aesthetic design appeal to you?
- Did you encounter problems?
- Would you use it again?
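Opinion questions like these are commonly administered as rating-scale items in a satisfaction questionnaire. A small sketch of scoring such a questionnaire (the item names and the 1–5 scale are assumptions for illustration):

```python
# Sketch: average rating-scale responses (1 = strongly disagree,
# 5 = strongly agree) per question across participants.
# Question names are invented for illustration.

def mean_scores(responses):
    """responses: list of dicts mapping question -> rating (1-5)."""
    questions = responses[0].keys()
    return {q: sum(r[q] for r in responses) / len(responses) for q in questions}

answers = [
    {"likes_it": 4, "does_what_i_want": 5, "would_use_again": 4},
    {"likes_it": 3, "does_what_i_want": 4, "would_use_again": 2},
]
print(mean_scores(answers))
# {'likes_it': 3.5, 'does_what_i_want': 4.5, 'would_use_again': 3.0}
```

Averages like these summarize opinions across users; interviews then probe why low-scoring items scored low.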
Asking Experts Their Opinions
- Use heuristics to step through tasks
- Typically use role-playing to identify problems
- It is inexpensive and quick to ask experts rather than perform laboratory and field evaluations
User Testing
- Recall the HutchWorld example
- Usually conducted in a controlled environment
- Users perform well-defined tasks
- Data can be collected and statistically analyzed
Modeling users’ task performance
Model human-computer interaction to predict the efficiency and problems in the design
This is successful for systems with limited functionality
Table 11.2 - page 347
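One well-known theoretically based model of this kind is the Keystroke-Level Model (KLM), which predicts an expert's task time by summing standard per-operator time estimates. A sketch using the classic operator values; the task breakdown itself is an invented example:

```python
# Sketch: Keystroke-Level Model (KLM) task-time prediction.
# Classic per-operator estimates in seconds (Card, Moran & Newell):
#   K = press a key, P = point with mouse, H = home hands on a device,
#   M = mental preparation.
KLM_TIMES = {"K": 0.2, "P": 1.1, "H": 0.4, "M": 1.35}

def predict_time(operators):
    """Predict expert task time (seconds) from a sequence of KLM operators."""
    return sum(KLM_TIMES[op] for op in operators)

# Invented example: think, move hand to mouse, point at a button, click.
task = ["M", "H", "P", "K"]
print(round(predict_time(task), 2))  # 3.05
```

Because the prediction needs only a task breakdown, no users have to be present, which is exactly why such models suit systems with limited, well-defined functionality.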
DECIDE: A Framework for Evaluation
- Determine the goals the evaluation addresses.
- Explore the specific questions to be answered.
- Choose the evaluation paradigm and techniques to answer the questions.
- Identify the practical issues.
- Decide how to deal with the ethical issues.
- Evaluate, interpret and present the data.
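The six DECIDE steps can double as a planning checklist for an evaluation. A sketch of that idea (the helper function and the plan's contents are invented for illustration):

```python
# Sketch: the DECIDE framework as an evaluation-planning checklist.
DECIDE_STEPS = [
    "Determine the goals the evaluation addresses",
    "Explore the specific questions to be answered",
    "Choose the evaluation paradigm and techniques",
    "Identify the practical issues",
    "Decide how to deal with the ethical issues",
    "Evaluate, interpret and present the data",
]

def missing_steps(plan):
    """Return the DECIDE steps an evaluation plan has not yet addressed."""
    return [step for step in DECIDE_STEPS if step not in plan]

# Invented example plan covering only the first two steps.
plan = {
    "Determine the goals the evaluation addresses": "improve checkout usability",
    "Explore the specific questions to be answered": "where do users give up?",
}
print(missing_steps(plan))  # the four steps still to be planned
```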
Determining the Goals
What are the goals of the evaluation? Who wants it and why? Goals influence the paradigm for the study.
Some examples of goals:
- Check that evaluators have understood user needs
- Check to ensure that the final interface is consistent
- Investigate how technology affects working practices
- Improve the usability of an existing product
Explore the Questions
All evaluations need goals & questions to guide them so time is not wasted on ill-defined studies.
For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions:
- What are customers’ attitudes to these new tickets?
- Are they concerned about security?
- Is the interface for obtaining them poor?
Choose Evaluation Paradigm and Techniques
Evaluation paradigms determine which types of techniques will be used.
Trade-Offs
Combinations of techniques – recall HutchWorld
Identifying Practical Issues
For example, how to:
- select users
- stay on budget
- stay on schedule
- select evaluators
- select equipment
Decide on Ethical Issues
Consideration for people’s rights.
Develop an informed consent form.
Participants have a right to:
- know the goals of the study
- know what will happen to the findings
- privacy of personal information
- not be quoted without their agreement
- leave when they wish
“do unto others only what you would not mind being done to you”
Evaluate, Interpret, and Present Data
- Reliability
- Validity
- Biases
- Scope
- Ecological validity
Pilot Studies
A pilot study is a small trial run of the main study.
Pilot studies are always useful for testing plans for an evaluation before launching the main study.
Often evaluators run several pilot studies.