Crowdsourcing consumer research
Data collection in consumer research has progressively moved away from traditional samples (e.g., university undergraduates) and toward Internet samples. In the last complete volume of the Journal of Consumer Research (June 2015-April 2016), 43% of behavioral studies were conducted on the crowdsourcing website Amazon Mechanical Turk (MTurk). The option to crowdsource empirical investigations offers great efficiency benefits to both individual researchers and the field, but it also poses new challenges and questions for how research should be designed, conducted, analyzed, and evaluated. We assess the evidence on the reliability of crowdsourced populations and the conditions under which crowdsourcing is a valid strategy for data collection. Based on this evidence, we propose specific guidelines for researchers to conduct high-quality research via crowdsourcing. We hope this tutorial will strengthen the community's scrutiny of data collection practices and move the field toward better and more valid crowdsourcing of consumer research.
Keywords: Crowdsourcing, Data collection, Mechanical Turk, MTurk, Sampling
Persistent URL: dx.doi.org/10.1093/jcr/ucx047, hdl.handle.net/1765/100286
Series: ERIM Top-Core Articles
Journal: Journal of Consumer Research
Goodman, J. K., & Paolacci, G. (2017). Crowdsourcing consumer research. Journal of Consumer Research, 44(1), 196–210. doi:10.1093/jcr/ucx047