Millions of OnlyFans paywalls make it hard to detect child sex abuse, cops say


by Ashley Belanger, Ars Technica

(Image credit: SOPA Images / Contributor | LightRocket)

OnlyFans' paywalls make it hard for police to detect child sexual abuse materials (CSAM) on the platform, Reuters reported, especially new CSAM, which can be harder to uncover online.

Because each OnlyFans creator posts their content behind their own paywall, five specialists in online child sexual abuse told Reuters that it's hard to independently verify just how much CSAM is posted. Police would seemingly need to subscribe to every account to monitor the entire platform, suggested Trey Amick, an expert who assists police in CSAM investigations.

OnlyFans claims that the amount of CSAM on its platform is extremely low. Out of 3.2 million accounts sharing "hundreds of millions of posts," OnlyFans removed only 347 posts as suspected CSAM in 2023. Each post was voluntarily reported to the CyberTipline of the National Center for Missing and Exploited Children (NCMEC), which OnlyFans told Reuters has "full access" to monitor content on the platform.

