Boomerang: Rebounding the Consequences of Reputation Feedback on Crowdsourcing Platforms

04/14/2019
by Snehalkumar, et al.

Paid crowdsourcing platforms suffer from low-quality work and unfair rejections, but paradoxically, most workers and requesters have high reputation scores. These inflated scores, which make high-quality work and workers difficult to find, stem from social pressure to avoid giving negative feedback. We introduce Boomerang, a reputation system for crowdsourcing that elicits more accurate feedback by rebounding the consequences of feedback directly back onto the person who gave it. With Boomerang, requesters find that their highly-rated workers gain earliest access to their future tasks, and workers find tasks from their highly-rated requesters at the top of their task feed. Field experiments verify that Boomerang causes both workers and requesters to provide feedback that is more closely aligned with their private opinions. Inspired by a game-theoretic notion of incentive-compatibility, Boomerang opens opportunities for interaction design to incentivize honest reporting over strategic dishonesty.
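The rebound mechanism described above can be sketched in a few lines: the rating a participant previously gave someone determines how that person's future work surfaces back to the rater. The sketch below is illustrative only; all names and the specific ranking/delay formulas are assumptions, not the paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class Task:
    title: str
    requester: str

def ranked_feed(tasks, my_ratings, default=3.0):
    """Worker side: tasks from requesters this worker rated highly
    appear first in the worker's own feed (unrated requesters get a
    neutral default score). Illustrative sketch, not the paper's code."""
    return sorted(tasks,
                  key=lambda t: my_ratings.get(t.requester, default),
                  reverse=True)

def early_access_delay(my_rating_of_worker, max_delay_hours=24.0):
    """Requester side: the higher the requester rated a worker (1-5),
    the sooner that worker sees the requester's new tasks. A 5-star
    worker waits 0 hours; a 1-star worker waits the full delay."""
    return max_delay_hours * (5.0 - my_rating_of_worker) / 4.0

# Example: a worker who rated req_b highly sees req_b's tasks first.
tasks = [Task("label images", "req_a"),
         Task("transcribe audio", "req_b"),
         Task("moderate posts", "req_c")]
my_ratings = {"req_a": 2.0, "req_b": 5.0}  # req_c never rated
feed = ranked_feed(tasks, my_ratings)
print([t.requester for t in feed])  # req_b first, then req_c (default), then req_a
```

Because honest ratings directly shape the rater's own future feed and labor pool, misreporting (e.g., rating everyone 5 stars) degrades the rater's experience rather than someone else's, which is the incentive-compatibility intuition behind Boomerang.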

