Disinformation, Stochastic Harm, and Costly Filtering: A Principal-Agent Analysis of Regulating Social Media Platforms

06/17/2021
by Shehroze Khan et al.

The spread of disinformation on social media platforms such as Facebook is harmful to society. This harm can take the form of a gradual degradation of public discourse, but it can also take the form of sudden dramatic events such as the recent insurrection on Capitol Hill. The platforms themselves are in the best position to prevent the spread of disinformation, as they have the best access to the relevant data and the expertise to use it. However, filtering disinformation is costly: it requires implementing filtering algorithms or employing manual moderation effort, and removing such highly viral content reduces user growth and thus potential advertising revenue. Since the costs of harmful content are borne by other entities, the platform has no incentive to filter at a socially optimal level. This problem is similar to that of environmental regulation, in which the costs of adverse events are not directly borne by the firm, the firm's mitigation effort is not observable, and the causal link between a harmful consequence and a specific failure is difficult to prove. In the environmental domain, one solution is to perform costly monitoring to ensure that the firm takes adequate precautions according to a specified rule. However, classifying disinformation is performative (deploying a classifier shifts the distribution of content it must classify), and thus a fixed rule becomes less effective over time. Encoding our domain as a Markov decision process, we demonstrate that no penalty based on a static rule, no matter how large, can incentivize adequate filtering by the platform. Penalties based on an adaptive rule can incentivize optimal effort, but, counterintuitively, only if the regulator sufficiently overreacts to harmful events by requiring a greater-than-optimal level of filtering.
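The interplay between a static rule and the platform's best response can be illustrated with a toy MDP. This is a minimal sketch, not the paper's actual model: the state space, payoffs, and staleness schedule below are invented for the example. The state tracks how stale the regulator's rule has become; as staleness grows, filtering to the rule's standard prevents fewer harmful events, so in this toy even a large static penalty eventually fails to buy effort.

```python
# Toy sketch of the setting (hypothetical model: the state space, payoffs,
# and decay schedule are invented for illustration, not taken from the paper).
#
# State s = staleness of the regulator's static rule (0 .. S-1, absorbing at S-1).
# Performativity: as s grows, filtering to the rule's standard prevents
# fewer harmful events. Action e = the platform's filtering effort.

GAMMA = 0.95             # discount factor
S = 5                    # staleness levels; s advances by 1 each step
EFFORTS = [0.0, 0.5, 1.0]
REVENUE = 10.0           # per-step ad revenue at zero filtering
COST = 4.0               # cost per unit of filtering effort

def harm_prob(effort, staleness):
    """Chance of a harmful event; effort helps less as the rule goes stale."""
    effectiveness = max(0.0, 1.0 - 0.25 * staleness)
    return 0.6 * (1.0 - effort * effectiveness)

def reward(effort, staleness, penalty):
    """Platform's expected per-step payoff under a penalty for harmful events."""
    return REVENUE - COST * effort - harm_prob(effort, staleness) * penalty

def platform_policy(penalty):
    """Value-iterate the platform's best response to a fixed static penalty."""
    V = [0.0] * S
    for _ in range(500):
        V = [max(reward(e, s, penalty) + GAMMA * V[min(s + 1, S - 1)]
                 for e in EFFORTS)
             for s in range(S)]
    return [max(EFFORTS, key=lambda e: reward(e, s, penalty)
                + GAMMA * V[min(s + 1, S - 1)])
            for s in range(S)]

print(platform_policy(0.0))    # no penalty: zero effort at every staleness level
print(platform_policy(50.0))   # large static penalty: effort collapses once the rule is fully stale
```

In this toy, once the rule is fully stale (effectiveness zero), no finite penalty induces any effort, echoing the static-rule impossibility result; an adaptive rule would correspond to resetting the staleness state after a harmful event.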


