Informed Crowds Can Effectively Identify Misinformation

08/17/2021
by Paul Resnick, et al.

Can crowd workers be trusted to judge whether news-like articles circulating on the Internet are wildly misleading, or do partisanship and inexperience get in the way? We assembled pools of both liberal and conservative crowd raters and tested three ways of asking them to make judgments about 374 articles. In a no research condition, they were simply asked to view the article and then render a judgment. In an individual research condition, they were also asked to search for corroborating evidence and provide a link to the best evidence they found. In a collective research condition, they were not asked to search, but instead to review links collected from workers in the individual research condition. The individual research condition reduced the partisanship of judgments. Moreover, the judgments of a panel of sixteen or more crowd workers were better than those of a panel of three expert journalists, as measured by alignment with a held-out journalist's ratings. Without research, the crowd judgments were better than those of a single journalist, but not as good as the average of two journalists.
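
The headline comparison, a panel of sixteen or more crowd raters scored against a held-out journalist's ratings, can be illustrated with a short sketch. The Python below is not the authors' code: the misleadingness scale, the use of Spearman rank correlation as the "alignment" measure, the resampling of random panels, and the toy data are all assumptions made purely for illustration.

```python
# Minimal sketch (assumptions only, not the paper's method) of comparing a
# crowd panel's averaged ratings against a held-out journalist's ratings.
import random
from scipy.stats import spearmanr

def panel_alignment(crowd_ratings, heldout_journalist, panel_size, trials=1000):
    """Average ratings from random crowd panels and correlate with a held-out rater.

    crowd_ratings: list of per-rater rating lists, one rating per article.
    heldout_journalist: one rating per article from the held-out journalist.
    Returns the mean Spearman correlation across resampled panels.
    """
    n_articles = len(heldout_journalist)
    scores = []
    for _ in range(trials):
        panel = random.sample(crowd_ratings, panel_size)
        # Aggregate the panel by taking the mean rating per article.
        panel_mean = [sum(r[i] for r in panel) / panel_size for i in range(n_articles)]
        rho, _ = spearmanr(panel_mean, heldout_journalist)
        scores.append(rho)
    return sum(scores) / len(scores)

# Toy data: 5 articles rated by 30 crowd raters on a hypothetical 1-7 scale.
crowd = [[random.randint(1, 7) for _ in range(5)] for _ in range(30)]
journalist = [random.randint(1, 7) for _ in range(5)]
print(panel_alignment(crowd, journalist, panel_size=16))
```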

12/29/2022
Voices of Workers: Why a Worker-Centered Approach to Crowd Work Is Challenging
How can we better understand the broad, diverse, shifting, and invisible...

08/21/2020
Investigating Differences in Crowdsourced News Credibility Assessment: Raters, Tasks, and Expert Criteria
Misinformation about critical issues such as climate change and vaccine ...

12/30/2020
The Challenges of Crowd Workers in Rural and Urban America
Crowd work has the potential of helping the financial recovery of region...

02/13/2019
Crowd Work on a CV? Understanding How AMT Fits into Turkers' Career Goals and Professional Profiles
In 2013, scholars laid out a framework for a sustainable, ethical future...

01/20/2023
Who wants to cooperate-and why? Attitude and perception of crowd workers in online labor markets
Existing literature and studies predominantly focus on how crowdsource w...

05/05/2020
CODA-19: Reliably Annotating Research Aspects on 10,000+ CORD-19 Abstracts Using Non-Expert Crowd
This paper introduces CODA-19, a human-annotated dataset that denotes th...

06/03/2018
Mix and Match: Collaborative Expert-Crowd Judging for Building Test Collections Accurately and Affordably
Crowdsourcing offers an affordable and scalable means to collect relevan...