YouTube’s extremist rabbit holes are deep but narrow

New research indicates most users don't see hateful YouTube content—but the site can further strengthen hateful echo chambers.
YouTube retooled its recommendation algorithms in 2019, but researchers say more work is needed. Deposit Photos


In the months following the 2016 presidential election, mounting evidence suggested YouTube’s video recommendations could send viewers down rabbit holes of extremist and hateful content. The heated criticism culminated in YouTube issuing a public statement in early 2019 announcing changes to its content algorithms in an effort to reduce “recommendations of borderline content and content that could misinform users in harmful ways.” Since then, YouTube has claimed those changes cut watch time for racist, sexist, and otherwise prejudiced content reached via users’ recommendation feeds by at least 50 percent.

Little empirical evidence has been publicly available to back up that assertion, but new research confirms at least some improvements to YouTube’s algorithms. That said, experts caution these gains don’t negate the harm that radicalizing content continues to inflict on users and the public.

According to a study conducted by a team of researchers from City University of New York, Stanford, Dartmouth, Northeastern University, and the University of Exeter, YouTube’s algorithmic recommendations do not necessarily funnel users down radicalizing rabbit holes, “possibly due to changes that the company made to its recommender system in 2019.”

[Related: OpenAI’s newest ChatGPT update can still spread conspiracy theories.]

The team’s findings, published August 30 in the journal Science Advances, drew on a US public opinion survey paired with browsing data voluntarily provided by 1,181 respondents between July and December 2020.

Rather than stemming from algorithmic recommendations, exposure to extremist and antagonistic content was largely concentrated among a much smaller subset of users already predisposed to seek it out. Still, the team argues the platform “continues to play a key role in facilitating exposure to content from alternative and extremist channels among dedicated audiences.” Not only that, but engagement with this content still generates advertising revenue.

“[The] study confirms that platforms like YouTube can, and should, do much more to restrict the reach of extremist content to the dedicated audiences that seek it out,” Science Advances editor Aaron Shaw writes in an accompanying piece. “YouTube and its parent Alphabet should divest from revenue generating activities related to content that contradicts their public commitments to reduce the spread of hate speech, harassment, and harmful conspiracy theories.”

Julie Millican, vice president of Media Matters, a nonprofit dedicated to monitoring far-right misinformation, believes YouTube deserves credit for improving its algorithms to better limit the proliferation of extremist content. “However, the fact remains that hateful, bigoted and conspiracy content remains rampant on the platform and too easily found for those seeking it,” she tells PopSci. “Even if only a smaller portion of users become radicalized by content on the platform, we have seen over and over that this radicalization can have deadly consequences.”

“In a time where social media platforms are backsliding in their efforts to curb hate and misinformation, YouTube should take this and other recent research as an encouraging sign and continue to invest in solutions to make it a safe platform for all consumers and society,” Millican continues.

While continued work on YouTube’s recommendation system is vital and admirable, the study’s researchers caution that “even low levels of algorithmic amplification can have damaging consequences when extrapolated over YouTube’s vast user base and across time.” Approximately 247 million Americans regularly use the platform, according to recent reports. YouTube representatives did not respond to PopSci’s request for comment at the time of writing.
