The quality of data sources is one of the biggest concerns in (big) data analytics. For example, humans can deliberately lie or behave strategically when reporting their beliefs or opinions in surveys and opinion polls, which results in data sets of low reliability. In theory, the issue of honest reporting of subjective data can be tackled by incentive-compatible methods such as proper scoring rules. We report on a study conducted on the crowdsourcing platform Amazon Mechanical Turk that investigates the efficacy of proper scoring rules in inducing workers to honestly report forecasts. Our novel experimental design is able to detect several strategies, other than truth-telling, employed by crowd workers. Our experimental results thus cast doubt on the usefulness of incentive-compatible methods, such as proper scoring rules, as a way of inducing honest reporting of subjective data and, thereby, of ensuring data quality.
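For readers unfamiliar with the mechanism the abstract refers to, the following is a minimal Python sketch of one standard proper scoring rule, the quadratic (Brier) score. The specific rule used in the study is not stated in this abstract, so the choice of rule here is an assumption made purely for illustration.

# Illustrative sketch only (not from the paper): the quadratic (Brier)
# scoring rule, one standard proper scoring rule for a binary forecast.

def brier_score(report: float, outcome: int) -> float:
    """Payment S(p, x) = 1 - (p - x)^2 for report p and outcome x in {0, 1}."""
    return 1.0 - (report - outcome) ** 2

def expected_score(report: float, belief: float) -> float:
    """Expected payment when the worker's true belief is `belief`."""
    return belief * brier_score(report, 1) + (1 - belief) * brier_score(report, 0)

# Properness: expected payment is maximized exactly at report == belief,
# so a risk-neutral worker has no incentive to misreport.
belief = 0.7
best_report = max((i / 100 for i in range(101)),
                  key=lambda r: expected_score(r, belief))
assert best_report == belief  # truthful reporting is optimal on this grid

Properness means that a risk-neutral worker maximizes expected payment by reporting a probability equal to their true belief; the assertion at the end checks this numerically for a belief of 0.7. The paper's point is that, despite this theoretical guarantee, workers in practice adopt other reporting strategies.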

hdl.handle.net/1765/110847
24th Americas Conference on Information Systems 2018: Digital Disruption, AMCIS 2018
Rotterdam School of Management (RSM), Erasmus University

Carvalho, A., & Kroon, D. (2018). A study on the behavior of crowd workers when reporting forecasts under proper scoring rules. In Americas Conference on Information Systems 2018: Digital Disruption, AMCIS 2018. Retrieved from http://hdl.handle.net/1765/110847