
Research and Analysis Methods

October 5, 2006


Surveys

• Electronic vs. Paper Surveys
  – Electronic: very efficient, but requires users willing to take them; possibility of technical problems; concerns about privacy
  – Paper: fewer concerns about confidentiality, but often low response rates
• Issue of length > how do you motivate people to fill in your survey?
  – Rewards > introduce the possibility of bias
  – Observer bias: the act of measuring changes the thing being measured
• Versatile: many types (user satisfaction, with a usability test, etc.)


Interviews vs. Focus Groups

• Interview: one person at a time; captures individual differences (individual subjectivity)
  – Problems: hard to generalize, hard to compare with other interviews (need schedule of questions)
• Focus groups: many people at one time; people stimulate responses within the group; can come to some group consensus
  – Problems: self-censorship; lack of privacy and confidentiality


Observation

• Go into the work site and watch people using a website; you may see
  – Problems (and how people solve them)
  – Use of secondary information (e.g. people need to look up words > add a glossary function)
  – Frequency of use of different parts of the site
• Generally, observations are not directed – usability tests use directed scenarios – info for scenarios often comes from user observation
• Problems: getting access to work sites; little value if the site is only used occasionally


Market Research

• Much research already available on general characteristics of some user groups:
  – Students
  – Yuppies
  – Men vs. women
• Good for demographic information (info about the larger population)
  – Can help identify characteristics of the sample to recruit for surveys, interviews, usability tests

• Problems: often little guidance for usability decisions (navigation, choice of info on site, etc.)


Site Usage Statistics

• Possible to get information from the server (see the log-parsing sketch below)
  – Who is using the website (IP address of the computer)
  – What pages are being accessed
• Problems:
  – Can't just count the number of times a page is called from the server > could just be someone moving back and forth within the site
  – IP addresses help you identify different users, but say nothing about their demographics or needs
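As a rough illustration of what server-side data looks like, here is a minimal sketch (the file name access.log and the Common Log Format layout are assumptions, not part of the slides) that tallies page requests per IP address and skips immediate back-and-forth repeats:

```python
from collections import Counter, defaultdict

# Minimal sketch: tally page requests per IP address from a Common Log Format file.
# The file name and log format are illustrative assumptions.
page_hits = Counter()
visitors = defaultdict(set)
last_page = {}  # last page requested per IP, used to skip back-and-forth reloads

with open("access.log") as log:
    for line in log:
        parts = line.split()
        if len(parts) < 7:
            continue  # skip malformed lines
        ip, page = parts[0], parts[6]  # CLF: ip - - [date] "GET /path HTTP/1.0" status bytes
        if last_page.get(ip) == page:
            continue  # same visitor hitting the same page again; don't double-count
        last_page[ip] = page
        page_hits[page] += 1
        visitors[page].add(ip)

for page, hits in page_hits.most_common(10):
    print(f"{page}: {hits} requests from {len(visitors[page])} distinct IPs")
```

Even with this kind of cleanup, the slide's caveat stands: IP counts say nothing about who the visitors are or what they need.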


Comparative Evaluation

• Ask target users what sites (and what features of other sites) they like
  – Identify characteristics of those sites and compare them with your own
• Look at other sites in your segment
  – Assumption: you are all trying to get the attention of the same target audience
  – Need to be able to match the functions, text, images that they use


Usability Tests

• For sites that already exist
• Identify specific problems through task scenario testing
  – Pick a typical task (perhaps from observation)
  – Ask the user to complete the task and talk their way through the steps (think-aloud protocol)
  – Observe; may sometimes need to prompt for thoughts and responses
• Final survey identifies general likes | dislikes
  – Colours, navigation, images, etc. of this site
  – Accuracy, completeness, consistency
  – Easy to understand, emotional involvement


Additional Methods

• Participatory Design
  – Include users in the (re)design of the website
  – Requires an organizational commitment to actually listen to the users (a problem sometimes with management vs. labour situations)
• Paper prototyping
  – Use paper rather than online prototypes because they are quick, flexible, easy to change, not too finished
  – Tangible (touch) methods often elicit more emotional/subjective info from users


Card Sorts

• Many different versions
  – See Lazar
• One use: decide what to put in | take out of a website (content analysis) (a tallying sketch follows below)
  – Identify content; put one item on one index card
  – Ask people to sort the cards into 5 piles (must have, nice to have, neutral, little use, would never use)
  – Size of piles: specific size (forced choice) or free choice; forced choice requires people to evaluate/make decisions
• Fun for users; if you have many decks of cards, you can test lots of people quickly
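To summarize the piles across participants, a minimal sketch (the pile labels come from the slide; the numeric weights and sample sorts are illustrative assumptions):

```python
from collections import defaultdict

# Pile labels come from the slide; the weights and sample sorts are illustrative assumptions.
WEIGHTS = {"must have": 2, "nice to have": 1, "neutral": 0,
           "little use": -1, "would never use": -2}

# Each participant's sort: content item -> pile it was placed in.
sorts = [
    {"opening hours": "must have", "staff photos": "little use", "site map": "neutral"},
    {"opening hours": "must have", "staff photos": "would never use", "site map": "nice to have"},
]

totals = defaultdict(int)
counts = defaultdict(int)
for sort in sorts:
    for item, pile in sort.items():
        totals[item] += WEIGHTS[pile]
        counts[item] += 1

# Rank items by average score: high scores suggest content to keep, low scores content to drop.
for item in sorted(totals, key=lambda i: totals[i] / counts[i], reverse=True):
    print(f"{item}: average score {totals[item] / counts[item]:+.1f}")
```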


Error Analysis (Critical Incidents)

• Useful to find what does not work

• Error logs, messages to the webmaster, phone calls for help, etc.

• Sites vary in terms of importance of errors:
  – What happens if someone doesn't find info on the computer science dept website?
  – What happens if someone can't find info on emergency contraception?
• May need to classify some user scenarios/tasks as critical (must be able to complete successfully); a log-filtering sketch follows below
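A minimal illustration of turning that classification into a routine check (the file name error.log, its line format, and the example critical paths are assumptions, not from the slides): filter the error log for entries that touch pages classified as critical.

```python
# Minimal sketch: flag error-log lines that mention pages classified as critical.
# The file name "error.log" and the example paths are illustrative assumptions.
CRITICAL_PATHS = {"/emergency-contraception", "/contact"}

critical_incidents = []
with open("error.log") as log:
    for line in log:
        if any(path in line for path in CRITICAL_PATHS):
            critical_incidents.append(line.rstrip())

print(f"{len(critical_incidents)} logged errors involve critical pages")
for entry in critical_incidents[:5]:  # show the first few for review
    print(entry)
```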


Heuristic (Expert) Evaluation

• Many problems can be found by educated usability professionals
  – Design, navigation, site hierarchy (too deep), performance, etc.
  – Can't identify subjective likes | dislikes, etc.
• Usability principles (from Lazar) and design principles (from Williams' Non-Designer's guides) can be applied to improve sites
• One method for the class project can be your group's own heuristic evaluation of problems
  – Just need to be able to explain/classify these
  – Why do you decide something is a problem…


More Things to Consider

• Population vs. Sample
  – The sample needs to represent the population
• Convenience vs. Random samples
  – Need to identify potential for bias in your sampling practices
    • Do students in the ASU represent all Acadia students?
    • Do students in the Wong Centre represent all groups the Wong wants to attract?
• How many people to survey or test
  – Surveys: 30+
  – Usability tests: 5 (Nielsen) – 20 (statistical validity); 8-9


Types of Information

• Demographic info
  – Need to gather info about the people in your surveys, interviews, focus groups, observations, usability tests to be sure they match the target users
• Content Questions
  – Create consistency by developing a schedule (set) of questions before you start your research
  – Watch out for leading questions (they imply the answer you want to hear)
    • We worked hard on Welcome Week; how successful was it?
    • You had the opportunity to attend Welcome Week; how successful was it?
  – Probes encourage responses ("and why do you believe that?") but don't add info


Types of Scales

• Open questions: free answer
• Closed questions: fixed set of answers (e.g., multiple choice)
• Semantic differential (good for emotion and subjectivity; a scoring sketch follows below)
  – Warm 1 2 3 4 5 Cold
  – Exciting 1 2 3 4 5 Boring
• Major distinction:
  – Qualitative Research (open questions, free observation, unstructured inquiry) vs.
  – Quantitative Research (closed questions, able to apply statistical analysis)
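A rough sketch of how semantic differential ratings can be summarized for quantitative analysis (the scale pairs come from the slide; the sample ratings are made-up illustrations):

```python
# Minimal sketch: average semantic differential ratings on a 1-5 scale
# (1 = left anchor, 5 = right anchor). The sample ratings are illustrative assumptions.
scales = {
    ("Warm", "Cold"): [2, 1, 3, 2],
    ("Exciting", "Boring"): [4, 5, 3, 4],
}

for (left, right), ratings in scales.items():
    mean = sum(ratings) / len(ratings)
    leaning = left if mean < 3 else right if mean > 3 else "neutral"
    print(f"{left}-{right}: mean {mean:.1f} (leaning {leaning})")
```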


Final Thoughts

• Anonymity vs. confidentiality: when you do research with human subjects, you need to protect them from harm
  – Anonymous responses – their identities are protected
  – Confidential responses – the information itself is not revealed except in statistical averages, etc.
• Reliability vs. validity
  – Research can be reliable (always gets the same kind of data) but not be valid (the data does not reflect the target population)
  – Rare to be valid but not reliable!