- YouTube Recommendations and Effects on Sharing Across Online Social Platforms
  YouTube recently announced a decision to exclude potentially harmful con...
- Understanding the Incel Community on YouTube
  YouTube is by far the largest host of user-generated video content world...
- Tubes and Bubbles – Topological confinement of YouTube recommendations
  The role of recommendation algorithms in online user confinement is at t...
- Producers of Popular Science Web Videos. Between New Professionalism and Old Gender Issues
  This article provides an overview of the web video production context re...
- "You Know What to Do": Proactive Detection of YouTube Videos Targeted by Coordinated Hate Attacks
  Over the years, the Web has shrunk the world, allowing individuals to sh...
- B-Script: Transcript-based B-roll Video Editing with Recommendations
  In video production, inserting B-roll is a widely used technique to enri...
- Middle-Aged Video Consumers' Beliefs About Algorithmic Recommendations on YouTube
  User beliefs about algorithmic systems are constantly co-produced throug...
"It is just a flu": Assessing the Effect of Watch History on YouTube's Pseudoscientific Video Recommendations
YouTube has revolutionized the way people discover and consume videos, becoming one of the primary news sources for Internet users. Since content on YouTube is generated by its users, the platform is particularly vulnerable to misinformative and conspiratorial videos. Even worse, the role played by YouTube's recommendation algorithm in unwittingly promoting questionable content is not well understood and could be exacerbating the problem. This can have dire real-world consequences, especially when pseudoscientific content is promoted to users at critical times, e.g., during the COVID-19 pandemic. In this paper, we set out to characterize and detect pseudoscientific misinformation on YouTube. We collect 6.6K videos related to COVID-19, the Flat Earth theory, and the anti-vaccination and anti-mask movements; using crowdsourcing, we annotate them as pseudoscience, legitimate science, or irrelevant. We then train a deep learning classifier to detect pseudoscientific videos with an accuracy of 76.1%. Using this classifier, we quantify user exposure to pseudoscientific content on various parts of the platform (i.e., a user's homepage, recommended videos while watching a specific video, or search results) and how this exposure changes based on the user's watch history. We find that YouTube's recommendation algorithm is more aggressive in suggesting pseudoscientific content when users are searching for specific topics, while these recommendations are less common on a user's homepage or when actively watching pseudoscientific videos. Finally, we shed light on how a user's watch history substantially affects the type of recommended videos.
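To make the detection task concrete, here is a minimal Python sketch of a three-way video-text classifier over the paper's annotation labels (pseudoscience, legitimate science, irrelevant). It is an illustrative baseline under assumed inputs, not the authors' model: the toy title/transcript strings are invented, and the paper's actual classifier is a deep network trained on richer, crowdsourced-annotated video data.

# Hypothetical baseline: TF-IDF features over video text feeding a small
# feed-forward network. Labels follow the paper's three annotation classes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Toy stand-ins for video titles/transcripts (illustrative only).
videos = [
    "they are hiding the truth vaccines cause illness",
    "peer-reviewed mRNA vaccine trial results explained",
    "funny cat compilation best moments",
    "flat earth proof simple horizon experiment",
    "how satellite imagery shows the curvature of the Earth",
    "top 10 goals of the football season",
]
labels = [
    "pseudoscience", "legitimate_science", "irrelevant",
    "pseudoscience", "legitimate_science", "irrelevant",
]

# Unigram and bigram TF-IDF features into one small hidden layer.
classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
)
classifier.fit(videos, labels)

# Classify an unseen (hypothetical) video description.
print(classifier.predict(["backyard experiment proves the earth is flat"]))

Once such a classifier is trained, the exposure analysis the abstract describes amounts to running it over the videos surfaced on each part of the platform (homepage, watch-page recommendations, search results) and comparing the fraction labeled pseudoscientific across profiles with different watch histories.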