Classic ways of gathering data on human behavior, such as laboratory-based user studies, can be time-consuming, costly, and limited by small participant pools. Crowdsourcing reduces operating costs and provides access to a large and diverse participant pool; however, it raises concerns about low worker pay and data quality. Gamification motivates participation, but requires the development of specialized, research-question-specific games that can be costly to produce. We provide an alternative that combines gamification and crowdsourcing in a smartphone-based system that emulates the popular freemium model of micro-transactions, motivating voluntary participation through in-game rewards and using a robust framework to study multiple unrelated research questions within the same system. We deployed our prototype framework on the Android market and gathered data over a period of five weeks. We compared these data to data gathered from a gamified laboratory version and a non-gamified laboratory version, and found that players who used the in-game rewards were motivated to complete experimental tasks. The data showed no difference between the groups in performance on a motor task; however, performance on a cognitive task was worse for the crowdsourced Android group. We discuss possible reasons for this and provide options for improving data collection and task performance.
Identifier | oai:union.ndltd.org:USASK/oai:ecommons.usask.ca:10388/ETD-2015-07-2166
Date | 2015 July 1900
Contributors | Mandryk, Regan
Source Sets | University of Saskatchewan Library
Language | English
Detected Language | English
Type | text, thesis