
Instagram reportedly served up child-sexualizing reels to followers of teen influencers

by
Richard Lai
from Engadget

Following X's ad controversy involving antisemitic content, it is now Meta's turn to come under the spotlight for its content algorithm. According to an experiment conducted by The Wall Street Journal, Instagram's Reels video service served "risque footage of children as well as overtly sexual adult videos" to test accounts that exclusively followed teen and preteen influencers - namely young gymnasts and cheerleaders. This sort of content is supposed to be forbidden on Meta's platforms.

To make matters worse, such salacious content was also mixed in with ads for notable US brands like Disney, Walmart, Pizza Hut, Bumble, Match Group and even The Wall Street Journal itself. The report added that the Canadian Centre for Child Protection separately achieved similar results in its own tests.

While Walmart and Pizza Hut apparently declined to comment, Bumble, Match Group, Hims (a retailer of erectile-dysfunction drugs) and Disney have since either pulled their ads from Meta or pressed the firm to address the issue. Given the earlier controversy on X, advertisers are understandably even more sensitive about the type of content shown next to their ads - especially Disney, which was affected by both the X and Instagram incidents.

In response, Meta told its clients that it was investigating, and that it "would pay for brand-safety auditing services to determine how often a company's ads appear beside content it considers unacceptable." However, the firm stopped short of providing a timetable or details on how it will prevent this from happening again.

While one could argue that such tests don't necessarily reflect real user experience (as tech companies tend to claim), Instagram's tendency to aggregate child-sexualizing content was a known problem internally - even before the launch of Reels, according to current and former Meta employees interviewed by the WSJ.

The same employees suggested that an effective solution would require revamping the recommendation algorithms responsible for pushing related content to users. That said, internal documents seen by the WSJ suggested that Meta has made it difficult for its safety team to apply such drastic changes, as traffic performance apparently takes priority for the social media giant.

This article originally appeared on Engadget at https://www.engadget.com/instagram-reportedly-served-up-child-sexualizing-reels-to-followers-of-teen-influencers-053251960.html?src=rss