Facebook moderators 'err on the side of an adult' when uncertain of age in possible abuse photos
by Jay Peters, The Verge
Illustration by Alex Castro / The Verge
A major responsibility for tech companies is to monitor their platforms for child sexual abuse material (CSAM); if any is found, they are legally required to report it to the National Center for Missing and Exploited Children (NCMEC). Many companies employ content moderators who review content flagged as potential CSAM and determine whether it should be reported to the NCMEC.
However, Facebook has a policy that could mean it is underreporting child sexual abuse content, according to a new report from The New York Times. A Facebook training document directs content moderators to "err on the side of an adult" when they don't know someone's age in a photo or video that's suspected to be CSAM,...