Mechanical Turk


Transcript of Mechanical Turk

Page 1: Mechanical Turk

Mechanical Turk

Online Sampling with Crowdsourcing

Page 2: Mechanical Turk

The Turk

Page 3: Mechanical Turk

Human Intelligence Tasks (HITs)

• Humans do some tasks better than machines

• Artificial Artificial Intelligence

• Marketplace for HITs

Page 4: Mechanical Turk

Page 5: Mechanical Turk

Turk Workers & Requestors

500,000 workers
190 countries
60% female*
83.5% white*
32.2 years old*
14.9 years of education*

*Berinsky, A. J., Huber, G. A., & Lenz, G. S. (2012). Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk. Political Analysis, 20, 351–368.

Two roles: “Turkers” (the workers) and Requestors (those who post HITs)

Page 6: Mechanical Turk

Why MTurk?

• Low-cost paneling
• Diverse sample
• Convenience
• Flexible
• Easily managed

Page 7: Mechanical Turk

Turk Samples

• “The MTurk sample does not perfectly match the demographic and attitudinal characteristics of the U.S. population but does not present a wildly distorted view of the U.S. population, either.”*
• Numerous social science experiments have been replicated on MTurk**
• Slightly more demographically diverse than standard Internet samples***
• Significantly more diverse than typical American college samples***
• At least as reliable as samples obtained via traditional methods***

*Berinsky, A. J., Huber, G. A., & Lenz, G. S. (2012). Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk. Political Analysis, 20, 351–368.
**Mason, W., & Suri, S. (2011). Conducting behavioral research on Amazon's Mechanical Turk. Behavior Research Methods, 44(1), 1–23.
***Buhrmester, M., Kwang, T., & Gosling, S. D. (2011). Amazon's Mechanical Turk: A new source of inexpensive, yet high-quality data? Perspectives on Psychological Science, 6(1), 3–5.

Page 8: Mechanical Turk

Turker Interface

Page 9: Mechanical Turk

Requestor Interface

Page 10: Mechanical Turk

Creating a HIT

Page 11: Mechanical Turk

Creating a HIT

Page 12: Mechanical Turk

Creating a HIT

Page 13: Mechanical Turk

Creating a HIT

Page 14: Mechanical Turk

Creating a HIT

• May request “master workers” of general/photo/category type
• Location, approval rate, and “mastery” are the only selection criteria (see the API sketch below)
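These same criteria can be set programmatically when HITs are scripted rather than built in the web interface. A minimal boto3 sketch, assuming AWS credentials are already configured; the two IDs shown are MTurk's documented system qualification types for locale and approval rate:

    import boto3

    # Minimal sketch: restrict a HIT to US-based workers with a lifetime
    # approval rate of at least 95%, using MTurk's built-in ("system")
    # qualification types.
    mturk = boto3.client("mturk", region_name="us-east-1")

    qualification_requirements = [
        {   # Worker_Locale: worker must be located in the United States
            "QualificationTypeId": "00000000000000000071",
            "Comparator": "EqualTo",
            "LocaleValues": [{"Country": "US"}],
        },
        {   # Worker_PercentAssignmentsApproved: approval rate >= 95
            "QualificationTypeId": "000000000000000000L0",
            "Comparator": "GreaterThanOrEqualTo",
            "IntegerValues": [95],
        },
    ]
    # Pass this list as the QualificationRequirements argument of
    # create_hit(); the Masters qualification has its own
    # environment-specific type ID.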

Page 15: Mechanical Turk

Creating a HIT

Page 16: Mechanical Turk

Creating a HIT

Page 17: Mechanical Turk

Creating a HIT

Page 18: Mechanical Turk

Managing HITs

Page 19: Mechanical Turk

Managing HITs

Page 20: Mechanical Turk

Linking to Third-Party Software (3PS)

• 3PS examples
  – Qualtrics, SurveyMonkey, etc.
• Why 3PS?
  – Between-subjects designs
  – Random assignment
  – Timing data
  – Diverse item types
  – Paging
  – No programming knowledge required
  – Data exportation
• How to link 3PS and MTurk?

Page 21: Mechanical Turk

Linking to 3PS

1. Create a survey in 3PS
   – The last question of the survey should disclose an “approval code”
2. Copy the URL of the survey into a HIT
   – The HIT has one question: “What is the approval code?” (see the API sketch below)
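The same two-step link can also be scripted. A minimal sketch using boto3's MTurk client, assuming AWS credentials are configured; the survey URL, title, reward, and time limits are placeholder values, not recommendations:

    import boto3

    # Minimal sketch of step 2: a one-question HIT that shows the survey
    # link and asks for the approval code revealed at the end of the survey.
    SURVEY_URL = "https://example.qualtrics.com/jfe/form/SV_XXXX"  # hypothetical

    QUESTION_XML = f"""
    <QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
      <Overview>
        <Text>Complete the survey at this link, then return here: {SURVEY_URL}</Text>
      </Overview>
      <Question>
        <QuestionIdentifier>approval_code</QuestionIdentifier>
        <IsRequired>true</IsRequired>
        <QuestionContent><Text>What is the approval code?</Text></QuestionContent>
        <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
      </Question>
    </QuestionForm>
    """

    mturk = boto3.client("mturk", region_name="us-east-1")
    hit = mturk.create_hit(
        Title="10-minute academic survey",
        Description="Take a short survey and enter the approval code shown at the end.",
        Keywords="survey, research, questionnaire",
        Reward="1.00",                        # USD, passed as a string
        MaxAssignments=100,
        AssignmentDurationInSeconds=30 * 60,  # per-worker completion limit
        LifetimeInSeconds=3 * 24 * 60 * 60,   # collection window
        Question=QUESTION_XML,
    )
    print(hit["HIT"]["HITId"])

Approval then reduces to comparing each submitted free-text answer against the code the survey disclosed.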

Page 22: Mechanical Turk

Linking to 3PS

Page 23: Mechanical Turk

Pitfalls

• Participation in multiple groups (a workaround is sketched below)
• HITs completed slowly
  – Too low pay
  – HIT time distorted
  – Uninteresting description
• Sample bias
  – Time of day / week
    • SES, education, work, family
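The first pitfall has no remedy among the web interface's built-in selection criteria, but a commonly used API-level workaround is to tag prior participants with a custom qualification and exclude them from later HITs. A hedged boto3 sketch; the qualification name and worker IDs are placeholders:

    import boto3

    # Sketch of a workaround for repeat participation: tag every worker
    # who completes a wave with a custom qualification, then require that
    # the qualification NOT exist on later HITs.
    mturk = boto3.client("mturk", region_name="us-east-1")

    qual = mturk.create_qualification_type(
        Name="Completed-My-Study",  # hypothetical qualification name
        Description="Assigned to workers who already took this study.",
        QualificationTypeStatus="Active",
    )
    qual_id = qual["QualificationType"]["QualificationTypeId"]

    # After each wave, tag the workers who submitted assignments.
    for worker_id in ["A1EXAMPLE", "A2EXAMPLE"]:  # placeholder worker IDs
        mturk.associate_qualification_with_worker(
            QualificationTypeId=qual_id,
            WorkerId=worker_id,
            IntegerValue=1,
            SendNotification=False,
        )

    # Add this requirement to every follow-up HIT to exclude tagged workers.
    exclude_repeats = {
        "QualificationTypeId": qual_id,
        "Comparator": "DoesNotExist",
    }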

Page 24: Mechanical Turk

Best Practices

• One survey, all conditions
• Thoughtful description, tags
• Estimate fair wage (see the sketch after this list)
  – General formula is (#items + #sentences + stimuli exposure time) × 2 = seconds to complete
  – Figure on minimum wage rate
• Limit HIT times
  – Completion time
  – Collection window
• Be consistent if doing multiple collections
• (Dis)approve HITs within a day or two
• Toss out multivariate outliers
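Worked through in code, the slide's wage heuristic looks like the sketch below; the minimum-wage figure is an assumption, so substitute the applicable rate:

    # Sketch of the slide's fair-wage heuristic:
    # (#items + #sentences + stimuli exposure seconds) * 2 = estimated
    # seconds, then price the HIT at no less than minimum wage for that
    # duration. MIN_WAGE_PER_HOUR is an assumption; substitute your rate.
    MIN_WAGE_PER_HOUR = 7.25  # US federal minimum wage, USD

    def estimate_reward(n_items: int, n_sentences: int, exposure_seconds: int) -> float:
        """Return a suggested per-HIT reward in USD."""
        seconds_to_complete = (n_items + n_sentences + exposure_seconds) * 2
        hours = seconds_to_complete / 3600
        return round(MIN_WAGE_PER_HOUR * hours, 2)

    # Example: 40 items, 20 instruction sentences, 60 s of stimuli
    # -> (40 + 20 + 60) * 2 = 240 s, i.e. 4 minutes -> about $0.48.
    print(estimate_reward(40, 20, 60))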

Page 25: Mechanical Turk