Working in Pairs: Understanding the Effects of Worker Interactions in Crowdwork

10/23/2018
by Chien-Ju Ho, et al.

Crowdsourcing has gained popularity as a tool to harness human brain power to help solve problems that are difficult for computers. Previous work in crowdsourcing often assumes that workers complete crowdwork independently. In this paper, we relax this independence assumption and explore how introducing direct, synchronous, free-style interactions between workers affects crowdwork. In particular, motivated by the concept of peer instruction in educational settings, we study the effects of peer communication in crowdsourcing environments. In the crowdsourcing setting with peer communication, pairs of workers are asked to complete the same task together: they first generate their initial answers to the task independently, then freely discuss the task with each other, and finally update their answers after the discussion. We experimentally examine the effects of peer communication on various common types of tasks on crowdsourcing platforms, including image labeling, optical character recognition (OCR), audio transcription, and nutrition analysis. Our experimental results show that work quality is significantly higher in tasks with peer communication than in tasks where workers complete the work independently. However, participating in tasks with peer communication has limited effect on workers' subsequent independent performance on tasks of the same type.

