We built a system like Apple’s to flag child sexual abuse material – and concluded the tech was dangerous

by Thom Holwerda, from OSnews

Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple's own employees have been expressing alarm. The company insists reservations about the system are rooted in "misunderstandings." We disagree.

We wrote the only peer-reviewed publication on how to build a system like Apple's - and we concluded the technology was dangerous. We're not concerned because we misunderstand how Apple's system works. The problem is, we understand exactly how it works.
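For context on the mechanism being debated: systems of this kind compute a perceptual hash (a compact fingerprint that survives resizing and re-encoding) of each photo on the device and compare it against a database of hashes of known abuse images, flagging anything within a small distance. Below is a minimal Python sketch of that matching step only, assuming the Pillow imaging library is available; it uses a toy average hash rather than Apple's proprietary NeuralHash, the blocklist entry and distance threshold are made-up placeholders, and the private set intersection and threshold reporting layers Apple describes are omitted entirely.

    from PIL import Image  # assumption: Pillow is installed


    def average_hash(path, size=8):
        # Toy perceptual hash: downscale to a size x size grayscale thumbnail
        # and set one bit per pixel, depending on whether that pixel is
        # brighter than the thumbnail's mean brightness.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits


    def hamming(a, b):
        # Number of differing bits between two hashes.
        return bin(a ^ b).count("1")


    # Hypothetical blocklist of hashes of known flagged images (placeholder value).
    BLOCKLIST = {0x0F0F0F0F0F0F0F0F}


    def is_flagged(path, max_distance=5):
        # Flag the photo if its hash lies within max_distance bits of any
        # blocklist entry. The fuzziness that makes matching robust to
        # re-encoding is also what leaves room for false or adversarial matches.
        h = average_hash(path)
        return any(hamming(h, bad) <= max_distance for bad in BLOCKLIST)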

There's now so much evidence from credible, trustworthy people and organisations that Apple's system is bad and dangerous that I find it hard to believe there are still people cheering Apple on.
