
Study of YouTube comments finds evidence of radicalization effect

by Natasha Lomas

Research presented at the ACM FAT 2020 conference in Barcelona today supports the notion that YouTube's platform is playing a role in radicalizing users via exposure to far-right ideologies.

The study, carried out by researchers at Switzerland's École polytechnique fédérale de Lausanne and the Federal University of Minas Gerais in Brazil, found evidence that users who engaged with a middle ground of extreme right-wing content migrated to commenting on the most fringe far-right content.

A March 2018 New York Times article by sociologist Zeynep Tufekci set out the now widely reported thesis that YouTube is a radicalization engine. Follow-up reporting by journalist Kevin Roose told a compelling tale of the personal experience of an individual, Caleb Cain, who described falling down an "alt-right rabbit hole" on YouTube. But researcher Manoel Horta Ribeiro, who was presenting the paper today, said the team wanted to see if they could find auditable evidence to support such anecdotes.

Their paper, called "Auditing radicalization pathways on YouTube," details a large-scale study of YouTube looking for traces of evidence - in likes, comments and views - that certain right-leaning YouTube communities are acting as gateways to fringe far-right ideologies.

Per the paper, they analyzed 330,925 videos posted on 349 channels - broadly classifying the videos into four types: Media, the Alt-lite, the Intellectual Dark Web (IDW) and the Alt-right - and using user comments as a "good enough" proxy for radicalization (their data set included 72 million comments).

The findings suggest a pipeline effect over a number of years where users who started out commenting on alt-lite/IDW YouTube content shifted to commenting on extreme far-right content on the platform over time.

The rate of overlap between consumers of Media content and the alt-right was found to be far lower.
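
To make the measurement concrete: this kind of migration signal can be computed from comment traces alone. The following is a minimal sketch of the idea, not the paper's actual pipeline; the record format, the category labels and the one-year "early activity" window are assumptions made for the example.

from collections import defaultdict
from datetime import datetime

# Hypothetical comment records: (user_id, channel_category, timestamp).
# Categories mirror the paper's four classes of channel.
comments = [
    ("u1", "alt-lite",  datetime(2015, 3, 1)),
    ("u1", "idw",       datetime(2016, 6, 9)),
    ("u1", "alt-right", datetime(2018, 1, 5)),
    ("u2", "media",     datetime(2016, 2, 2)),
    ("u2", "media",     datetime(2018, 7, 4)),
]

def migration_fraction(comments, gateway=frozenset({"alt-lite", "idw"}), target="alt-right"):
    """Fraction of users whose earliest comments fell exclusively in the
    gateway categories and who later commented on target-category channels."""
    by_user = defaultdict(list)
    for user, category, ts in comments:
        by_user[user].append((ts, category))
    migrated = eligible = 0
    for user, events in by_user.items():
        events.sort()  # chronological order
        first_year = events[0][0].year
        early = {cat for ts, cat in events if ts.year == first_year}
        if early and early <= gateway:  # started exclusively on milder content
            eligible += 1
            later = {cat for ts, cat in events if ts.year > first_year}
            if target in later:
                migrated += 1
    return migrated / eligible if eligible else 0.0

print(migration_fraction(comments))  # -> 1.0 on this toy data (u1 migrated)

Run over millions of real comments, a statistic like this can show whether the milder-to-extreme flow is systematic rather than anecdotal, which is the shape of the evidence the paper reports.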

"A significant amount of commenting users systematically migrates from commenting exclusively on milder content to commenting on more extreme content," they write in the paper. "We argue that this finding provides significant evidence that there has been, and there continues to be, user radicalization on YouTube, and our analyses of the activity of these communities" is consistent with the theory that more extreme content 'piggybacked' on the surge in popularity of I.D.W. and Alt-lite content" We show that this migration phenomenon is not only consistent throughout the years, but also that it is significant in its absolute quantity."

The researchers were unable to determine the exact mechanism that migrates YouTube users from consuming "alt-lite" politics to engaging with the most fringe and extreme far-right ideologies, citing two key challenges on that front: limited access to recommendation data, and the fact that the study did not take personalization (which can affect a user's recommendations on YouTube) into account.

But even without personalization, they say they were "still able to find a path in which users could find extreme content from large media channels."
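
That kind of claim reduces to reachability in a recommendation graph: can you walk from a large media channel to an extreme channel following only non-personalized recommendations? A minimal sketch, assuming a precollected channel-to-channel recommendation snapshot (the channel names below are invented for illustration):

from collections import deque

# Hypothetical snapshot of non-personalized channel recommendations
# (edges point from a channel to the channels recommended from it).
recommendations = {
    "media_news":      ["idw_talk", "media_docs"],
    "media_docs":      ["media_news"],
    "idw_talk":        ["altlite_pundit"],
    "altlite_pundit":  ["altright_fringe"],
    "altright_fringe": [],
}

def recommendation_path(graph, start, targets):
    """Breadth-first search: return the shortest recommendation path from
    `start` to any channel in `targets`, or None if none is reachable."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] in targets:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(recommendation_path(recommendations, "media_news", {"altright_fringe"}))
# -> ['media_news', 'idw_talk', 'altlite_pundit', 'altright_fringe']

The existence of such a path doesn't prove users take it, which is why the authors stop short of a causal claim; it only shows extreme content was discoverable from mainstream starting points.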

During a conference Q&A after presenting the paper, Horta Ribeiro was asked what evidence the team had that the radicalization effect the study identifies had occurred through YouTube rather than via some external site, or because the people in question were more radicalized to begin with (and therefore more attracted to extreme ideologies), as opposed to YouTube itself acting as an active radicalization pipeline.

He agreed it's difficult to make an absolute claim that YouTube is to blame, but argued that, as host to these communities, the platform bears responsibility.

"We do find evident traces of user radicalization, and I guess the question asks why is YouTube responsible for this? And I guess the answer would be because many of these communities they live on YouTube and they have a lot of their content on YouTube and that's why YouTube is so deeply associated with it," he said.

"In a sense I do agree that it's very hard to make the claim that the radicalization is due to YouTube or due to some recommender system or that the platform is responsible for that. It could be that something else is leading to this radicalization and in that sense I think that the analysis that we make it shows there is this process of users going from milder channels to more extreme ones. And this solid evidence towards radicalization because people that were not exposed to this radical content become exposed. But it's hard to make strong causal claims - like YouTube is responsible for that."

We reached out to YouTube for a response to the research, but the company did not reply to our questions.

The company has tightened its approach toward certain far-right and extremist content in recent years, in the face of growing political and public pressure over hate speech, targeted harassment and radicalization risks.

It has also been experimenting with reducing algorithmic amplification of certain types of potentially damaging nonsense content that falls outside its general content guidelines - such as malicious conspiracy theories and junk science.
