
Scaffolding Crowd Work: An Empirical Analysis of Its Effects on Quality of Work

Workers’ skill levels on online labor platforms vary greatly. To manage quality, requesters often decompose complex jobs into simple, repetitive micro-tasks that pay on average $0.10 per task. Within the current system, workers waste significant time finding micro-tasks, while requesters still criticize the resulting quality and struggle to decompose complex work into micro-tasks, allocate those tasks appropriately, and reintegrate the outcomes into a final result. This paper explores how scaffolding approaches (examples, rubrics, task rationale, and step-by-step instructions) can improve work quality and enable workers to perform more complex tasks. In a between-subjects study, novice workers from a micro-task market performed tasks (writing product reviews and designing a slide deck) selected from a professional contractor market. Participants received either the original or a scaffolded task description, and experts blind to condition judged the submitted work. We found that scaffolding led workers to perform significantly better than workers without scaffolding. Moreover, a follow-up analysis shows that scaffolding micro-task workers yields work on par with that of workers from the high-pay contractor platform. We conclude by discussing if, how, and by whom scaffolding in a crowdsourcing environment can and should be implemented.

Identifier: oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:liu-143419
Date: January 2017
Creators: Streuer, Monika
Publisher: Linköpings universitet, Institutionen för ekonomisk och industriell utveckling, Karlsruhe Institute of Technology
Source Sets: DiVA Archive at Uppsala University
Language: English
Detected Language: English
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess