YouTube, The Great Radicalizer? Auditing and Mitigating Ideological Biases in YouTube Recommendations

03/20/2022
by Muhammad Haroon, et al.

Recommendation algorithms on social media platforms are often criticized for placing users in "rabbit holes" of increasingly ideologically biased content. Despite these concerns, prior evidence on such algorithmic radicalization is inconsistent, and prior work lacks systematic interventions that reduce the potential ideological bias in recommendation algorithms. We conduct a systematic audit of YouTube's recommendation system using 100,000 sock puppets to determine the presence of ideological bias (i.e., whether recommendations align with users' ideology), its magnitude (i.e., whether users are recommended an increasing number of videos aligned with their ideology), and radicalization (i.e., whether recommendations become progressively more extreme). We also design and evaluate a bottom-up intervention that minimizes ideological bias in recommendations without relying on cooperation from YouTube. We find that YouTube's recommendations do direct users, especially right-leaning users, to ideologically biased and increasingly radical content in both homepage and up-next recommendations. Our intervention effectively mitigates the observed bias, producing more recommendations of ideologically neutral, diverse, and dissimilar content, although debiasing is especially challenging for right-leaning users. Our systematic assessment shows that while YouTube's recommendations lead to ideological bias, such bias can be mitigated through our intervention.
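The audit rests on sock puppets: freshly instantiated browser profiles that build a watch history of known ideological slant and then record what the platform recommends to them. The sketch below is only a rough illustration of that data-collection protocol, not the authors' actual infrastructure; it assumes Selenium with a Chrome driver, and the seed URLs and YouTube CSS selectors are placeholders that may not match the site's current markup.

```python
# Illustrative sock-puppet audit loop (a sketch, not the paper's pipeline).
# Assumptions: Selenium + chromedriver are installed, and the CSS selectors
# "ytd-compact-video-renderer a#thumbnail" (up-next sidebar) and
# "ytd-rich-item-renderer a#thumbnail" (homepage grid) still match YouTube's DOM.
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

# Hypothetical training set: videos with a known ideological slant used to
# condition the puppet's watch history.
SEED_VIDEOS = [
    "https://www.youtube.com/watch?v=VIDEO_ID_1",
    "https://www.youtube.com/watch?v=VIDEO_ID_2",
]


def collect_links(driver, css_selector, limit=20):
    """Return the hrefs of recommendation items currently rendered on the page."""
    elements = driver.find_elements(By.CSS_SELECTOR, css_selector)
    links = [e.get_attribute("href") for e in elements[:limit]]
    return [link for link in links if link]


def run_sock_puppet(seed_videos, watch_seconds=30):
    """Train one fresh puppet on seed videos, then record what it is shown."""
    driver = webdriver.Chrome()  # fresh profile, i.e., a user with no prior history
    try:
        up_next = []
        # Training phase: watch each seed video long enough to register a view,
        # then capture the up-next recommendations shown alongside it.
        for url in seed_videos:
            driver.get(url)
            time.sleep(watch_seconds)
            up_next.append(
                collect_links(driver, "ytd-compact-video-renderer a#thumbnail")
            )

        # Measurement phase: capture homepage recommendations after training.
        driver.get("https://www.youtube.com/")
        time.sleep(5)
        homepage = collect_links(driver, "ytd-rich-item-renderer a#thumbnail")
        return {"up_next": up_next, "homepage": homepage}
    finally:
        driver.quit()


if __name__ == "__main__":
    print(run_sock_puppet(SEED_VIDEOS))
```

In the study itself, many such puppets run in parallel and the ideology of every collected video is then classified to quantify bias and radicalization over time; the sketch above only shows the collection step for a single puppet.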

Related research

08/22/2019 · Auditing Radicalization Pathways on YouTube
Non-profits and the media claim there is a radicalization pipeline on Yo...

08/07/2020 · Middle-Aged Video Consumers' Beliefs About Algorithmic Recommendations on YouTube
User beliefs about algorithmic systems are constantly co-produced throug...

11/25/2020 · Evaluating the scale, growth, and origins of right-wing echo chambers on YouTube
Although it is understudied relative to other social media platforms, Yo...

01/15/2020 · Tubes & Bubbles – Topological confinement of YouTube recommendations
The role of recommendation algorithms in online user confinement is at t...

01/30/2021 · When the Umpire is also a Player: Bias in Private Label Product Recommendations on E-commerce Marketplaces
Algorithmic recommendations mediate interactions between millions of cus...

02/15/2023 · Assessing enactment of content regulation policies: A post hoc crowd-sourced audit of election misinformation on YouTube
With the 2022 US midterm elections approaching, conspiratorial claims ab...

04/22/2022 · Subscriptions and external links help drive resentful users to alternative and extremist YouTube videos
Do online platforms facilitate the consumption of potentially harmful co...
