Autocompletion interfaces make crowd workers slower, but their use promotes response diversity

07/21/2017
by   Xipei Liu, et al.

Creative tasks such as ideation or question proposal are powerful applications of crowdsourcing, yet the number of workers available for addressing practical problems is often insufficient. Enabling scalable crowdsourcing thus requires extracting all possible efficiency and information from the available workers. One option for text-focused tasks is to allow assistive technology, such as an autocompletion user interface (AUI), to help workers input text responses. But evidence for the efficacy of AUIs is mixed. Here we designed and conducted a randomized experiment in which workers were asked to provide short text responses to given questions. Our experimental goal was to determine whether an AUI helps workers respond more quickly and with improved consistency by mitigating typos and misspellings. Surprisingly, we found that neither occurred: workers assigned to the AUI treatment were slower than those assigned to the non-AUI control, and their responses were more diverse, not less, than those of the control. Both the lexical and semantic diversities of responses were higher, with the latter measured using word2vec. A crowdsourcer interested in worker speed may want to avoid using an AUI, but using an AUI to boost response diversity may be valuable to crowdsourcers interested in receiving as much novel information from workers as possible.
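The abstract mentions measuring the semantic diversity of worker responses with word2vec. A common way to do this, sketched below with toy hand-made vectors standing in for a trained word2vec model (an assumption; a real analysis would load trained embeddings, e.g. via gensim's `KeyedVectors`), is to embed each response as the average of its word vectors and take the mean pairwise cosine distance:

```python
import numpy as np

def embed(response, vectors):
    """Embed a response as the mean of its word vectors
    (words missing from the vocabulary are skipped)."""
    words = [vectors[w] for w in response.lower().split() if w in vectors]
    return np.mean(words, axis=0)

def semantic_diversity(responses, vectors):
    """Mean pairwise cosine distance between response embeddings;
    higher values mean the responses are more semantically diverse."""
    embs = [embed(r, vectors) for r in responses]
    dists = []
    for i in range(len(embs)):
        for j in range(i + 1, len(embs)):
            a, b = embs[i], embs[j]
            cos = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
            dists.append(1.0 - cos)
    return sum(dists) / len(dists)

# Toy word vectors (hypothetical; a real study would use a trained model).
toy = {
    "cheap": np.array([1.0, 0.1]),
    "fast":  np.array([0.1, 1.0]),
    "quick": np.array([0.15, 0.95]),
}

# Near-synonymous responses score lower than semantically unrelated ones.
low  = semantic_diversity(["fast", "quick"], toy)
high = semantic_diversity(["fast", "cheap"], toy)
```

Under this measure, a worker pool whose responses cluster around the same idea yields a low score, while a pool producing novel, unrelated ideas yields a high one.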

Related research

12/29/2022 - Voices of Workers: Why a Worker-Centered Approach to Crowd Work Is Challenging
    How can we better understand the broad, diverse, shifting, and invisible...

10/05/2016 - Universal Clustering via Crowdsourcing
    Consider unsupervised clustering of objects drawn from a discrete set, t...

11/06/2017 - Sequential Multi-Class Labeling in Crowdsourcing
    We consider a crowdsourcing platform where workers' responses to questio...

12/05/2018 - A Technical Survey on Statistical Modelling and Design Methods for Crowdsourcing Quality Control
    Online crowdsourcing provides a scalable and inexpensive means to collec...

10/26/2017 - Optimal Crowdsourced Classification with a Reject Option in the Presence of Spammers
    We explore the design of an effective crowdsourcing system for an M-ary ...

09/12/2017 - Certified Computation in Crowdsourcing
    A wide range of learning tasks require human input in labeling massive d...

02/25/2023 - Mitigating Observation Biases in Crowdsourced Label Aggregation
    Crowdsourcing has been widely used to efficiently obtain labeled dataset...
