The YouTube Algorithm Isn’t Radicalizing People


In recent years, there has been a popular narrative in the media that videos from highly partisan, conspiracy theory-driven YouTube channels radicalize young Americans and that YouTube’s recommendation algorithm leads users down a path of increasingly radical content.

However, a new study from the Computational Social Science Lab (CSSLab) at the University of Pennsylvania finds that users’ own political interests and preferences play the primary role in what they choose to watch. In fact, if the recommendation features have any impact on users’ media diets, it is a moderating one.

“On average, relying exclusively on the recommender results in less partisan consumption,” says lead author Homa Hosseinmardi, associate research scientist at the CSSLab.

YouTube Bots

To determine the true effect of YouTube’s recommendation algorithm on what users watch, the researchers created bots that either followed the recommendation engine’s suggestions or ignored them entirely. The bots were trained on the YouTube watch histories of 87,988 real users, collected from October 2021 to December 2022.
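
The core of that design can be sketched in a few lines of code. The following is a minimal illustration only; the function names, the burn-in length, and the `get_recommendations` helper (standing in for whatever suggestions YouTube surfaces next to a video) are assumptions for exposition, not the CSSLab’s actual implementation:

```python
import random

# Hedged sketch of the two bot types described in the study. All names here
# (replay_user, follow_recommender, get_recommendations, burn_in) are
# illustrative assumptions, not the researchers' code.

def replay_user(history):
    """'Real user' bot: replays a person's actual watch history,
    ignoring the recommender entirely."""
    for video in history:
        yield video

def follow_recommender(history, get_recommendations, n_steps, burn_in=30):
    """Counterfactual bot: watches the first `burn_in` videos of a real
    user's history to build a personalized profile, then chooses every
    subsequent video from the recommender's suggestions alone."""
    watched = list(history[:burn_in])   # establish a personalized profile
    current = watched[-1]
    for _ in range(n_steps):
        recs = get_recommendations(current, watched)  # e.g. the "Up next" list
        current = random.choice(recs)   # user preference plays no role from here
        watched.append(current)
        yield current
```

Comparing the content each bot type ends up watching is what lets the authors separate the recommender’s contribution from the user’s own preferences.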

Guided by Penn Integrates Knowledge University Professor Duncan Watts, Hosseinmardi and co-authors Amir Ghasemian, Miguel Rivera-Lanas, Manoel Horta Ribeiro and Robert West aimed to untangle the complex relationship between user preferences and the recommendation algorithm, a relationship that evolves with each video watched.

Each bot was assigned its own YouTube account so that its viewing history could be tracked, and the partisanship of the videos it watched was estimated from the metadata associated with each video.
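
As a rough illustration of that last step, one might score each video by its channel’s estimated partisanship and average over a bot’s history. The schema and score range below are hypothetical stand-ins for the study’s metadata-based estimator:

```python
def diet_partisanship(watched_videos, channel_scores):
    """Average partisanship of a bot's media diet.

    `watched_videos` is a list of dicts with a 'channel_id' key;
    `channel_scores` maps channel id -> score in [-1, 1] (left to right).
    Both the schema and the score range are assumptions for illustration.
    """
    scores = [channel_scores[v["channel_id"]]
              for v in watched_videos
              if v["channel_id"] in channel_scores]
    return sum(scores) / len(scores) if scores else 0.0
```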

This story was written by Hailey Reissman and Delphine Gardiner. To read the full article, please visit ASC.
