"It is just a flu": Assessing the Effect of Watch History on YouTube's Pseudoscientific Video Recommendations

10/22/2020
by Kostantinos Papadamou, et al.

YouTube has revolutionized the way people discover and consume videos, becoming one of the primary news sources for Internet users. Since content on YouTube is generated by its users, the platform is particularly vulnerable to misinformative and conspiratorial videos. Even worse, the role played by YouTube's recommendation algorithm in unwittingly promoting questionable content is not well understood and may exacerbate the problem. This can have dire real-world consequences, especially when pseudoscientific content is promoted to users at critical times, e.g., during the COVID-19 pandemic. In this paper, we set out to characterize and detect pseudoscientific misinformation on YouTube. We collect 6.6K videos related to COVID-19, the Flat Earth theory, and the anti-vaccination and anti-mask movements; using crowdsourcing, we annotate them as pseudoscience, legitimate science, or irrelevant. We then train a deep learning classifier to detect pseudoscientific videos with an accuracy of 76.1%. Using this classifier, we quantify user exposure to pseudoscientific content on various parts of the platform (i.e., a user's homepage, recommended videos while watching a specific video, or search results) and how this exposure changes based on the user's watch history. We find that YouTube's recommendation algorithm is more aggressive in suggesting pseudoscientific content when users are searching for specific topics, while these recommendations are less common on a user's homepage or when actively watching pseudoscientific videos. Finally, we shed light on how a user's watch history substantially affects the type of recommended videos.
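The abstract does not spell out the classifier's architecture or input features, only that a deep learning model labels each video as pseudoscience, legitimate science, or irrelevant. As a rough illustration of that detection task, the sketch below trains a small feed-forward network on TF-IDF features of a video's textual metadata. The toy corpus, the labels, and the choice of features are illustrative assumptions, not the authors' setup.

```python
# Minimal sketch of a three-class video classifier, assuming each video is
# represented by concatenated textual metadata (e.g., title, tags, transcript).
# This TF-IDF + small feed-forward network is a stand-in baseline, NOT the
# authors' deep learning architecture, which the abstract does not describe.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical crowdsourced annotations: (video_text, label) pairs.
videos = [
    ("nasa is hiding the truth the earth is flat", "pseudoscience"),
    ("masks do not work wake up it is just a flu", "pseudoscience"),
    ("how mrna vaccines train the immune system", "legitimate science"),
    ("epidemiologist explains covid-19 transmission data", "legitimate science"),
    ("my weekend vlog trying street food", "irrelevant"),
    ("top 10 funny cat moments compilation", "irrelevant"),
]
texts, labels = zip(*videos)

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),      # word unigrams and bigrams
    MLPClassifier(hidden_layer_sizes=(64,),   # one small hidden layer
                  max_iter=1000, random_state=42),
)
model.fit(texts, labels)

# Classify an unseen video by its metadata text.
print(model.predict(["doctors admit the pandemic is a hoax"]))
```

In the paper's setting, a classifier like this would be trained on the 6.6K crowdsourced annotations and then applied to videos collected from homepages, watch-page recommendations, and search results to estimate how often each surface exposes users to pseudoscientific content.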
