What is the Will of the People? Moderation Preferences for Misinformation

02/01/2022
by Shubham Atreja, et al.

To reduce the spread of misinformation, social media platforms may take enforcement actions against offending content, such as adding informational warning labels, reducing distribution, or removing content entirely. However, both their actions and their inactions have been controversial and plagued by allegations of partisan bias. The controversy can in part be explained by a lack of clarity around what actions should be taken, as such decisions may not neatly reduce to questions of factual accuracy. When decisions are contested, the legitimacy of the decision-making process becomes crucial to public acceptance. Platforms have tried to legitimize their decisions by following well-defined procedures through rules and codebooks. In this paper, we consider an alternative source of legitimacy: the will of the people. Surprisingly little is known about what ordinary people want platforms to do about specific content. We provide empirical evidence about lay raters' preferences for platform actions on 368 news articles. Our results confirm that on many items there is no clear consensus on which actions to take. There is no partisan difference in how many items are judged to deserve platform action, but liberals do prefer somewhat more action on content from conservative sources, and vice versa. We find a clear hierarchy of perceived severity, with inform being the least severe action, followed by reduce, and then remove. We also find that judgments about two holistic properties, misleadingness and harm, could serve as an effective proxy for determining which actions would be approved by a majority of raters. We conclude with the promise of the will of the people, while acknowledging the practical details that would have to be worked out.

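To make the aggregation concrete, the sketch below shows one way majority approval could be computed from lay raters' stated action preferences, along with an illustrative proxy rule that predicts approval from averaged misleadingness and harm judgments. This is a minimal illustration, not the paper's actual procedure: the data fields, rating scales, equal weighting, and thresholds are all assumptions.

# Illustrative sketch only: compute which platform actions a majority of
# raters approve for one article, plus a simple proxy rule based on mean
# misleadingness and harm judgments. Field names, 1-7 scales, the equal
# weighting, and the thresholds are assumptions, not the paper's method.

from dataclasses import dataclass
from statistics import mean
from typing import Dict, List

ACTIONS = ["inform", "reduce", "remove"]  # ordered by perceived severity

@dataclass
class Rating:
    approved_actions: List[str]   # actions this rater endorses for the article
    misleadingness: int           # assumed 1-7 scale
    harm: int                     # assumed 1-7 scale

def majority_approved(ratings: List[Rating]) -> Dict[str, bool]:
    """An action is majority approved if more than half of raters endorse it."""
    n = len(ratings)
    return {
        action: sum(action in r.approved_actions for r in ratings) > n / 2
        for action in ACTIONS
    }

def proxy_prediction(ratings: List[Rating], threshold: float = 3.5) -> Dict[str, bool]:
    """Predict approval from averaged misleadingness and harm judgments.
    The equal weighting and severity-dependent thresholds are illustrative."""
    score = 0.5 * mean(r.misleadingness for r in ratings) + 0.5 * mean(r.harm for r in ratings)
    return {
        "inform": score > threshold - 1.0,   # least severe action: lowest bar
        "reduce": score > threshold,
        "remove": score > threshold + 1.0,   # most severe action: highest bar
    }

# Example usage with made-up ratings for a single article
ratings = [
    Rating(["inform"], 4, 2),
    Rating(["inform", "reduce"], 6, 5),
    Rating(["inform", "reduce", "remove"], 7, 6),
]
print(majority_approved(ratings))  # {'inform': True, 'reduce': True, 'remove': False}
print(proxy_prediction(ratings))

In this sketch, an action counts as approved when more than half of the raters endorse it, and the proxy applies progressively higher thresholds for the more severe actions, mirroring the inform, reduce, remove severity ordering reported above.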

Related research

FAIRY: A Framework for Understanding Relationships between Users' Actions and their Social Feeds (08/08/2019)
Users increasingly rely on social media feeds for consuming daily inform...

MEWS: Real-time Social Media Manipulation Detection and Analysis (05/11/2022)
This article presents a beta-version of MEWS (Misinformation Early Warni...

A User-Driven Framework for Regulating and Auditing Social Media (04/20/2023)
People form judgments and make decisions based on the information that t...

A Bayesian social platform for inclusive and evidence-based decision making (02/13/2021)
Against the backdrop of a social media reckoning, this paper seeks to de...

The Challenge of Understanding What Users Want: Inconsistent Preferences and Engagement Optimization (02/23/2022)
Online platforms have a wealth of data, run countless experiments and us...

Comparing the Perceived Legitimacy of Content Moderation Processes: Contractors, Algorithms, Expert Panels, and Digital Juries (02/13/2022)
While research continues to investigate and improve the accuracy, fairne...
