Workers’ skill levels on online labor platforms vary greatly. To manage quality, requesters often decompose complex jobs into simple, repetitive micro-tasks that pay on average $0.10 per task. Within the current system, workers waste a significant amount of time finding micro-tasks, while requesters still criticize the resulting performance and struggle to decompose complex work into micro-tasks, allocate these tasks appropriately, and reintegrate the outcomes into a final result. This paper explores how scaffolding approaches — examples, rubrics, task rationale, and step-by-step instructions — can be applied to improve work quality and enable workers to perform more complex tasks. In a between-subjects study, novice workers from a micro-task market performed tasks (writing product reviews and designing a slide deck) selected from a professional contractor market. Participants received either the original or a scaffolded task description. Experts blind to condition judged the quality of the submissions. We found that scaffolding led crowd workers to perform significantly better than workers without scaffolding. Moreover, a follow-up analysis shows that scaffolding micro-task workers yields work on par with that of workers from the high-pay contractor platform. Finally, we discuss if, how, and by whom scaffolding in a crowdsourcing environment can and should be implemented.
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:liu-143419 |
Date | January 2017 |
Creators | Streuer, Monika |
Publisher | Linköpings universitet, Institutionen för ekonomisk och industriell utveckling, Karlsruhe Institute of Technology |
Source Sets | DiVA Archive at Uppsala University |
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |