
Yet Another Study Debunks The ‘YouTube’s Algorithm Drives People To Extremism’ Argument

by Mike Masnick, from Techdirt

A few weeks ago, we had director Alex Winter on the podcast to talk about his latest documentary, The YouTube Effect. In that film he spoke with a young man who talked about getting "radicalized" on YouTube and going down the "alt-right rabbit hole." One thing that Alex talked about in the podcast, but was not in the documentary, was that, at one point, he asked the guy to go to YouTube and see if it would take him down that path again, and he couldn't even get it to recommend sketchy videos no matter how hard he tried.

The story that's made the rounds over the years was that YouTube's algorithm was "a radicalization machine." Indeed, that story has been at the heart of many recent attacks on the recommendation algorithms of many different sites.

And yet, it's not clear that the story holds up. It is possible that it was true at one point, but even that I'd call into question. Two years ago we wrote about a detailed study looking at YouTube's recommendation algorithm from January 2016 through December of 2019, and try as they might, the researchers could find no evidence that the algorithm pushed people to more extreme content. As that study noted:

We find no evidence that engagement with far-right content is caused by YouTube recommendations systematically, nor do we find clear evidence that anti-woke channels serve as a gateway to the far right.

Anyway, the journal Science now has another study on this same topic that... more or less finds the same thing. This study was done in 2020 (so after the last study) and also finds little evidence of the algorithm driving people down rabbit holes of extremism.

Our findings suggest that YouTube's algorithms were not sending people down "rabbit holes" during our observation window in 2020...

Indeed, this new research report cites the one we wrote about two years ago, saying that it replicated those findings, while also highlighting that it did so during the 2020 election year.

We report two key findings. First, we replicate findings from Hosseinmardi et al. (20) concerning the overall size of the audience for alternative and extreme content and enhance their validity by examining participants' attitudinal variables. Although almost all participants use YouTube, videos from alternative and extremist channels are overwhelmingly watched by a small minority of participants with high levels of gender and racial resentment. Within this group, total viewership is heavily concentrated among a few individuals, a common finding among studies examining potentially harmful online content (27). Similar to prior work (20), we observe that viewers often reach these videos via external links (e.g., from other social media platforms). In addition, we find that viewers are often subscribers to the channels in question. These findings demonstrate the scientific contribution made by our study. They also highlight that YouTube remains a key hosting provider for alternative and extremist channels, helping them continue to profit from their audience (28, 29) and reinforcing concerns about lax content moderation on the platform (30).

Second, we investigate the prevalence of rabbit holes in YouTube's recommendations during the fall of 2020. We rarely observe recommendations to alternative or extremist channel videos being shown to, or followed by, nonsubscribers. During our study period, only 3% of participants who were not already subscribed to alternative or extremist channels viewed a video from one of these channels based on a recommendation. On one hand, this finding suggests that unsolicited exposure to potentially harmful content on YouTube in the post-2019 era is rare, in line with findings from prior work (24, 25).

What's a little odd, though, is that this new study keeps attributing this result to changes YouTube made to the algorithm in 2019, and even suggests that the finding might be explained by YouTube having already radicalized all the people open to being radicalized.

But... that ignores that the other study that they directly cite found the same thing starting in 2016.

It's also a little strange that this new study seems to want to find something to be mad at YouTube about. It focuses on the fact that, even though the algorithm isn't driving new users to extremist content, that content is still on YouTube, and external links (often from nonsense peddlers) drive new traffic to it:

Our data indicate that many alternative and extremist channels remain on the platform and attract a small but active audience of individuals who expressed high levels of hostile sexism and racial resentment in survey data collected in 2018. These participants frequently subscribe to the channels in question, generating more frequent recommendations. By continuing to host these channels, YouTube facilitates the growth of problematic communities (many channel views originate in referrals from alternative social media platforms where users with high levels of gender and racial resentment may congregate).

Except, if that's the case, then it doesn't really matter what YouTube does here. Because even if YouTube took down that content, those content providers would post it elsewhere (hello Rumble!) and the same nonsense peddlers would just point there. So, it's unclear what the YouTube problem is in this study.

Hell, the study even finds that in the rare cases where the recommendation algorithm does suggest some nonsense peddler to a non-jackass, most people know better than to click:

We also observe that people rarely follow recommendations to videos from alternative and extreme channels when they are watching videos from mainstream news and non-news channels.

Either way, I do think it's fairly clear that the story you've heard about YouTube radicalizing people not only isn't true today, but if it was ever true, it hasn't been so in a long, long time.

The issue is not recommendations. It's not social media. It's that there is a subset of the population who seem primed and ready to embrace ridiculous, cult-like support for a bunch of grifting nonsense peddlers. I'm not sure how society fixes that, but YouTube isn't magically going to fix it either.
