YouTube video recommendations lead to more extremist content for right-leaning users, researchers suggest
YouTube’s video recommendation algorithm can lead right-leaning users down a rabbit hole of extremist political content, according to a recent study by researchers at the University of California, Davis. The study, titled “Auditing YouTube’s recommendation system for ideologically congenial, extreme, and problematic recommendations,” found that YouTube tends to recommend videos that align with a user’s ideological leaning, with right-leaning users receiving recommendations from channels that promote extremism, conspiracy theories, and other problematic content.
The research team conducted a systematic audit of YouTube’s recommendations in 2021 and 2022, using 100,000 automated accounts known as sock puppets. Each sock puppet watched 100 videos matching its assigned ideology, after which its homepage recommendations were collected. The study revealed that right-leaning users were more likely to receive recommendations from channels sharing political extremism and conspiracy theories, while left-leaning users received significantly fewer such recommendations.
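To make the audit design concrete, the sketch below illustrates the general sock-puppet approach the researchers describe: train an automated account on videos from one ideology, then record which channels its homepage recommends. This is a simplified illustration, not the study’s actual code; the function names, video IDs, and channel labels are placeholders, and a real audit would drive authenticated browser sessions and scrape the live homepage feed.

```python
import random
from collections import Counter

# Placeholder video pools per ideology; a real audit would curate these
# from actual channel lists rather than synthetic IDs.
TRAINING_VIDEOS = {
    "left": [f"left_vid_{i}" for i in range(200)],
    "center": [f"center_vid_{i}" for i in range(200)],
    "right": [f"right_vid_{i}" for i in range(200)],
}

# Placeholder labels standing in for channels flagged as problematic.
PROBLEMATIC_CHANNELS = {"channel_x", "channel_y"}


def watch_video(session, video_id):
    """Stand-in for playing a video in a logged-in browser session so the
    recommender registers it in the account's watch history."""
    session["history"].append(video_id)


def collect_homepage_recommendations(session):
    """Stand-in for scraping the homepage feed after training the account.
    Returns dummy channel names purely for illustration."""
    return [random.choice(["channel_x", "channel_a", "channel_b"]) for _ in range(30)]


def run_sock_puppet(ideology, n_training_videos=100):
    """One automated account: watch videos of an assigned ideology,
    then count homepage recommendations from problematic channels."""
    session = {"ideology": ideology, "history": []}
    for video_id in random.sample(TRAINING_VIDEOS[ideology], n_training_videos):
        watch_video(session, video_id)
    recommendations = collect_homepage_recommendations(session)
    return Counter(ch for ch in recommendations if ch in PROBLEMATIC_CHANNELS)


if __name__ == "__main__":
    # Tally how often each ideology's puppet is recommended problematic channels.
    for ideology in ("left", "center", "right"):
        hits = sum(run_sock_puppet(ideology).values())
        print(ideology, "problematic recommendations:", hits)
```

Repeated across many such accounts per ideology, this kind of loop yields the comparison the study reports: the share of sock puppets in each group that received at least one recommendation from a problematic channel.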
Following these recommendations can have a significant impact, setting off a cycle of exposure to more problematic content without requiring any additional input from users. The study classified channels as problematic if they promoted extremist ideas such as white nationalism, the alt-right, and QAnon. Notably, more than 36% of the sock puppets categorized as right-leaning received video recommendations from these problematic channels, compared to 32% for centrist and left-leaning users.
The prevalence of problematic recommendations on YouTube is concerning, especially considering the platform’s popularity and influence. YouTube is the most popular video-sharing platform, with approximately 81% of the U.S. population using it and a constantly growing user base. Additionally, over 70% of the content watched on YouTube is recommended by its algorithm.
The researchers emphasized the need for greater transparency in recommendation systems like YouTube’s to prevent filter bubbles and the potential for radicalization. “It’s important that we identify the specific factors that increase the likelihood that users encounter extreme or problematic content,” said Magdalena Wojcieszak, a professor of communication in the College of Letters and Science at the University of California, Davis.
The study’s findings shed light on the potential dangers of YouTube’s recommendation algorithm, particularly for right-leaning users who may be exposed to extremist content. As YouTube continues to play a significant role in shaping public opinion and providing information, it is crucial to address the issues surrounding its recommendation system. Doing so would give users a safer and more balanced experience on the platform.