If YouTube’s algorithms radicalize people, it’s hard to tell from the data

by John Timmer
from Ars Technica - All content
Image: YouTube's recommendation algorithm probably didn't send them to Washington, DC. (credit: Brent Stirton / Getty Images)

We've all seen it happen: Watch one video on YouTube and your recommendations shift, as if Google's algorithms think the video's subject is your life's passion. Suddenly, all the recommended videos (and probably many ads) you're presented with are on the topic.

Mostly, the results are comical. But there has been a steady stream of stories about how the process has radicalized people, sending them down an ever-deepening rabbit hole until all their viewing is dominated by fringe ideas and conspiracy theories.

A new study released on Monday looks at whether these stories represent a larger trend or are just a collection of anecdotes. While the data can't rule out the existence of online radicalization, it definitely suggests that radicalization isn't the most common experience. Instead, it seems that fringe ideas are simply part of a larger self-reinforcing community.


Source RSS or Atom Feed
Feed Location http://feeds.arstechnica.com/arstechnica/index
Feed Title Ars Technica - All content
Feed Link https://arstechnica.com/