Treating Crowdsourcing as Examination: How to Score Tasks and Online Workers?

04/26/2022 · by Guangyang Han, et al.

Crowdsourcing is an online outsourcing mode that can meet modern machine learning algorithms' urgent need for massive labeled data. Requesters post tasks on crowdsourcing platforms, which employ online workers over the Internet to complete the tasks, then aggregate and return the results to the requesters. How to model the interaction between different types of workers and tasks is an active research question. In this paper, we model workers as four types according to their ability: expert, normal worker, sloppy worker, and spammer, and divide tasks into hard, medium, and easy according to their difficulty. We observe that even experts struggle with hard tasks while sloppy workers can get easy tasks right, and that spammers deliberately give wrong answers. Good examination tasks should therefore have moderate difficulty and high discriminability, so that workers can be scored more objectively. Accordingly, we first score workers' ability mainly on the medium-difficulty tasks, then reduce the weight of answers from sloppy workers and modify the answers from spammers when inferring the tasks' ground truth. A probabilistic graphical model is adopted to describe the task execution process, and an iterative method is used to alternately update the ground truth, the workers' abilities, and the tasks' difficulties. We verify the correctness and effectiveness of our algorithm in both simulated and real crowdsourcing scenarios.
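To make the alternating update concrete, here is a minimal Python sketch of such a loop, assuming binary labels, majority-vote initialization, and hypothetical fixed cutoffs (EXPERT_T, NORMAL_T, SLOPPY_T, EASY_T, HARD_T) for worker types and task difficulty. The paper infers these quantities with a probabilistic graphical model rather than hard thresholds, so this is an illustration of the scheme, not the authors' implementation.

```python
import numpy as np

# Hypothetical cutoffs; the paper learns worker types and task
# difficulty probabilistically instead of thresholding.
EXPERT_T, NORMAL_T, SLOPPY_T = 0.9, 0.7, 0.4   # worker accuracy cutoffs
EASY_T, HARD_T = 0.8, 0.5                      # per-task agreement cutoffs

def iterate_labels(answers, n_iters=10):
    """answers: (n_workers, n_tasks) array of 0/1 labels.
    Returns estimated truth, worker abilities, task difficulties."""
    n_workers, n_tasks = answers.shape
    truth = (answers.mean(axis=0) >= 0.5).astype(int)  # majority-vote init

    for _ in range(n_iters):
        # Task difficulty proxy: fraction of workers agreeing with
        # the current truth estimate.
        agreement = (answers == truth).mean(axis=0)
        medium = (agreement >= HARD_T) & (agreement < EASY_T)

        # Score workers mainly on medium-difficulty tasks.
        score_mask = medium if medium.any() else np.ones(n_tasks, bool)
        ability = (answers[:, score_mask] == truth[score_mask]).mean(axis=1)

        # Per-worker weights: full weight for experts/normal workers,
        # reduced weight for sloppy workers, and a negative weight that
        # flips the answers of spammers who are reliably wrong.
        weight = np.where(ability >= NORMAL_T, 1.0,
                 np.where(ability >= SLOPPY_T, 0.5, -1.0))

        # Weighted signed vote (label 0 -> -1, label 1 -> +1), then
        # update the ground-truth estimate.
        vote = (weight[:, None] * (2 * answers - 1)).sum(axis=0)
        truth = (vote >= 0).astype(int)

    difficulty = 1.0 - (answers == truth).mean(axis=0)
    return truth, ability, difficulty
```

Scoring on medium-difficulty tasks mirrors the examination analogy: easy tasks cannot separate experts from sloppy workers, and hard tasks penalize everyone, so neither carries much discriminative information about ability.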


research · 02/12/2018
Distinguishing Question Subjectivity from Difficulty for Improved Crowdsourcing
The questions in a crowdsourcing task typically exhibit varying degrees ...

research · 12/20/2020
Exploring Effectiveness of Inter-Microtask Qualification Tests in Crowdsourcing
Qualification tests in crowdsourcing are often used to pre-filter worker...

research · 02/14/2023
A Provably Improved Algorithm for Crowdsourcing with Hard and Easy Tasks
Crowdsourcing is a popular method used to estimate ground-truth labels b...

research · 10/23/2018
Working in Pairs: Understanding the Effects of Worker Interactions in Crowdwork
Crowdsourcing has gained popularity as a tool to harness human brain pow...

research · 01/12/2021
Toward Effective Automated Content Analysis via Crowdsourcing
Many computer scientists use the aggregated answers of online workers to...

research · 12/29/2022
Recovering Top-Two Answers and Confusion Probability in Multi-Choice Crowdsourcing
Crowdsourcing has emerged as an effective platform to label a large volu...

research · 08/01/2018
How Does Tweet Difficulty Affect Labeling Performance of Annotators?
Crowdsourcing is a popular means to obtain labeled data at moderate cost...
