The Internet has enabled the recruitment of large samples with specific characteristics. However, when researchers rely on participant self-report to determine eligibility, data quality depends on participant honesty. Across four studies on Amazon Mechanical Turk, we show that a substantial number of participants misrepresent theoretically relevant characteristics (e.g., demographics, product ownership) to meet eligibility criteria that are stated explicitly in the studies, inferred from a previous exclusion from the study, or inferred from previous experience with similar studies. When recruiting rare populations, a large proportion of responses can come from impostors. We provide recommendations for ensuring that ineligible participants are excluded, which are applicable to a wide variety of data collection efforts that rely on self-report.

doi.org/10.1177/1948550617698203, hdl.handle.net/1765/108256
ERIM Top-Core Articles
Social Psychological and Personality Science
Rotterdam School of Management (RSM), Erasmus University

Chandler, J., & Paolacci, G. (2017). Lie for a Dime: When Most Prescreening Responses Are Honest but Most Study Participants Are Impostors. Social Psychological and Personality Science, 8(5), 500–508. https://doi.org/10.1177/1948550617698203