Subscriptions Drive Views of Extremist Videos on YouTube
upstart writes:
Study shows viewership of harmful content concentrated among a small group of users:
As the second most popular social media platform in the world, YouTube frequently attracts criticism. In particular, critics argue that its algorithmic recommendations facilitate radicalization and extremism by sending users down "rabbit holes" of harmful content.
According to a new study published in Science Advances, however, exposure to alternative and extremist video channels on YouTube is not driven by recommendations. Instead, most consumption of these channels on the platform can be attributed to a small group of users high in gender and racial resentment who subscribe to these channels and follow links to their videos.
The study authors caution that these findings do not exonerate the platform. "YouTube's algorithms may not be recommending alternative and extremist content to nonsubscribers very often, but they are nonetheless hosting it for free and funneling it to subscribers in ways that are of great concern," says co-author Brendan Nyhan, the James O. Freedman Presidential Professor at Dartmouth.
[...] In 2019, YouTube announced that changes to its algorithms had reduced watch time of harmful content by 50%, with a 70% decline in watch time by nonsubscribers. These claims had not been independently verified, so the research team set out to determine who is watching this type of content and to evaluate what recommendations YouTube's algorithm offers.
[...] Given the challenge of characterizing the content of every single video viewed, the researchers focused on the type of YouTube channels people watched. They compiled lists of channels that journalists and academics had identified as alternative or extremist and then examined how often participants viewed videos from those channels.
Read more of this story at SoylentNews.