7. Amazon Mechanical Turk
A marketplace for HITs (Human Intelligence Tasks)
"Requesters" (employers) post tasks
"Workers" (on line users) pickup and complete
task
Payment (few cents for HIT)
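To make the requester/worker mechanics concrete, here is a hedged sketch of what posting a HIT looks like through the AWS SDK for Python (boto3). The task, title, reward, and fee rate below are illustrative assumptions, not figures from the slides; Amazon's commission has varied over time.

```python
# Minimal QuestionForm XML for a one-question HIT (illustrative content).
QUESTION_XML = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>label</QuestionIdentifier>
    <QuestionContent><Text>Is this image a cat?</Text></QuestionContent>
    <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
  </Question>
</QuestionForm>"""


def hit_cost(reward: float, assignments: int, fee_rate: float = 0.20) -> float:
    """Requester's total cost: reward * assignments plus Amazon's commission.
    The 20% fee_rate is an assumption for illustration."""
    return round(reward * assignments * (1 + fee_rate), 2)


def post_hit(reward: float = 0.05, assignments: int = 3):
    """Post one HIT to the MTurk sandbox; requires AWS credentials."""
    import boto3  # imported lazily so the cost helper runs without the SDK

    client = boto3.client(
        "mturk",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )
    return client.create_hit(
        Title="Classify an image",            # shown to Workers in the HIT list
        Description="Answer one yes/no question about an image",
        Reward=str(reward),                   # a few cents, as the slide notes
        MaxAssignments=assignments,           # distinct Workers who may answer
        LifetimeInSeconds=3600,               # how long the HIT stays listed
        AssignmentDurationInSeconds=300,      # time limit per Worker
        Question=QUESTION_XML,
    )


print(hit_cost(0.05, 3))  # → 0.18
```

Each of the `assignments` Workers who accepts the HIT is paid the same reward, so redundant labeling (a common quality-control tactic) multiplies the cost accordingly.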
8. Typical (frequently posted) Tasks
Transcription
Content generation
Content rewriting
Object classification
Content evaluation and feedback
Mediator for other requesters (!)
Panagiotis Ipeirotis (NYU) "Analyzing the Amazon Mechanical Turk Marketplace"
9. Prices*
90% of HITs < $0.10
* Data collected from January 2009 to April 2010
Panagiotis Ipeirotis (NYU) "Analyzing the Amazon Mechanical Turk Marketplace"
10. Volume of transactions
90% of days < $2,000
median arrival rate: $1,040 per day
median completion rate: $1,155 per day
* Data collected from January 2009 to April 2010
Panagiotis Ipeirotis (NYU) "Analyzing the Amazon Mechanical Turk Marketplace"
11. Responsiveness and reward
Assignment: 100 HITs
Michael Franklin et al. (Berkeley and ETH) "CrowdDB: Answering Queries with Crowdsourcing"
12. Not only AMT
RABJ (Freebase)
2.3M human judgments
18 sec/judgment
Typical tasks: reconciliation, typewriter,
geographer, genderizer
Shailesh Kochhar et al "The Anatomy of a large scale human computation engine"
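A back-of-the-envelope check of RABJ's scale from the two numbers on this slide (2.3M judgments at 18 seconds each):

```python
# Total human effort behind RABJ's 2.3M judgments, from the slide's figures.
judgments = 2_300_000      # human judgments processed
seconds_each = 18          # average time per judgment

total_hours = judgments * seconds_each / 3600
person_days = total_hours / 24

print(round(total_hours))  # → 11500 hours of human effort
print(round(person_days))  # → 479 round-the-clock person-days
```

That is roughly 11,500 hours of labeling, which gives a sense of why a dedicated human-computation engine, rather than ad-hoc task posting, was worth building.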
13. Not only AMT
SamaSource
founder Leila Chirayath Janah (26)
social business (raised > $5M from Google,
Rockefeller, Ford Foundation, et al.)
>2,000 workers from developing countries (India, Kenya)
$15/day per worker
To date, has paid out $1M in wages
14. Not only AMT
Startups!
CrowdFlower
(organizes AMT HITs for enterprises)
uTest: software testing
(functional, UX, usability)
MicroTask: document processing
(feeds paper documents into systems)
19. References
• Michael Franklin et al. (Berkeley and ETH)
"CrowdDB: Answering Queries with Crowdsourcing"
• Shailesh Kochhar et al. (Metaweb Technologies)
"The Anatomy of a Large Scale Human Computation Engine"
• Panagiotis Ipeirotis (NYU)
"Analyzing the Amazon Mechanical Turk Marketplace"
• Aditya Parameswaran et al. (Stanford)
"Answering Queries using Humans, Algorithms and Databases"
• http://amplab.cs.berkeley.edu/
• Adam Marcus et al. (MIT)
"Crowdsourced Databases: Query Processing with People"