Article 5XQN9 Facebook moderators 'err on the side of an adult' when uncertain of age in possible abuse photos

by Jay Peters, from The Verge - All Posts on (#5XQN9)
Illustration by Alex Castro / The Verge

A major responsibility for tech companies is to monitor their platforms for child sexual abuse material (CSAM); if any is found, they are legally required to report it to the National Center for Missing and Exploited Children (NCMEC). Many companies employ content moderators who review content flagged as potential CSAM and determine whether it should be reported to NCMEC.

However, Facebook has a policy that could mean it is underreporting child sexual abuse content, according to a new report from The New York Times. A Facebook training document directs content moderators to "err on the side of an adult" when they don't know someone's age in a photo or video that's suspected to be CSAM,...

Continue reading...

External Content
Source RSS or Atom Feed
Feed Location http://www.theverge.com/rss/index.xml
Feed Title The Verge - All Posts
Feed Link https://www.theverge.com/