Damning probes find Instagram is key link connecting pedophile rings
Instagram has emerged as the most important platform for buyers and sellers of underage sex content, according to investigations from The Wall Street Journal, Stanford Internet Observatory, and the University of Massachusetts Amherst (UMass) Rescue Lab.
While other platforms play a role in processing payments and delivering content, Instagram is where hundreds of thousands, and perhaps millions, of users search explicit hashtags to uncover illegal "menus" of content that can then be commissioned. Content on offer includes disturbing imagery of children self-harming, "incest toddlers," and minors performing sex acts with animals, as well as opportunities for buyers to arrange illicit meetups with children, the Journal reported.
Because the child sexual abuse material (CSAM) itself is not hosted on Instagram, platform owner Meta has a harder time detecting and removing these users. Researchers found that even when Meta's trust and safety team does ban users, their efforts are "directly undercut" by Instagram's recommendation system, which allows the networks to quickly reassemble under "backup" accounts that are typically listed in the bios of the original accounts precisely so the network can survive bans.