Understanding Crowdsourcing Requesters’ Wage Setting Behaviors

Authors

Kotaro Hara and Yudai Tanaka

Publication

CHI EA '22: CHI Conference on Human Factors in Computing Systems Extended Abstracts, April 2022, Article No.: 368, Pages 1–6, https://doi.org/10.1145/3491101.3519660

Abstract

Requesters on crowdsourcing platforms like Amazon Mechanical Turk (AMT) compensate workers inadequately. One potential reason for the underpayment is that AMT's requester interface provides limited information about estimated wages, preventing requesters from knowing whether they are offering a fair piece-rate reward. To assess whether presenting wage information affects requesters' reward-setting behaviors, we conducted a controlled study with 63 participants. The study used a mixed design with a three-level between-subjects factor: participants received either no wage information, a wage point estimate, or a wage distribution. Each participant went through three stages of adjusting the reward and, with it, the estimated wage. Our analysis with Bayesian growth curve modeling suggests that the estimated wage derived from the participant-set reward increased from $2.56/h to $2.69/h with point-estimate information and from $2.33/h to $2.74/h with distribution information, while it decreased from $2.06/h to $1.99/h in the control condition.
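The wage figures in the abstract come from converting a piece-rate reward into an hourly rate via task completion time. As a minimal sketch of that conversion (the $0.05 reward, the sample completion times, and the function names below are hypothetical illustrations, not values or code from the paper), the same reward can yield either a single point estimate or a distribution of wage estimates, mirroring the two information conditions:

```python
import statistics

def hourly_wage(reward_usd: float, completion_time_s: float) -> float:
    """Convert a piece-rate reward into an hourly wage estimate."""
    return reward_usd / completion_time_s * 3600

# Hypothetical piece-rate reward and observed task completion times (seconds).
reward = 0.05
completion_times = [55, 68, 72, 81, 90, 104, 120]

# Point estimate: one wage figure, derived from the median completion time.
point_estimate = hourly_wage(reward, statistics.median(completion_times))

# Distribution: one wage estimate per observed completion time.
wage_distribution = sorted(hourly_wage(reward, t) for t in completion_times)

print(f"Point estimate: ${point_estimate:.2f}/h")
print("Distribution:", [f"${w:.2f}/h" for w in wage_distribution])
```

Under these made-up numbers the point estimate lands near $2.22/h, in the same range as the wages reported in the abstract; showing the full distribution additionally conveys how much slower workers earn far less for the same reward.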

Paper