We built a system like Apple’s to flag child sexual abuse material – and concluded the tech was dangerous
Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple's own employees have been expressing alarm. The company insists reservations about the system are rooted in misunderstandings. We disagree.
We wrote the only peer-reviewed publication on how to build a system like Apple's, and we concluded the technology was dangerous. We're not concerned because we misunderstand how Apple's system works. The problem is that we understand exactly how it works.
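To make the concern concrete, here is a minimal sketch of the general technique such scanning systems build on: perceptual hashing, in which each photo is reduced to a short fingerprint and compared against a database of fingerprints of known abuse images. This is not Apple's NeuralHash or our own published design; the average-hash function, the blocklist, and the distance threshold below are simplified, hypothetical stand-ins meant only to illustrate the matching step.

```python
# A simplified illustration of perceptual-hash matching (not Apple's system).
# The blocklist of known-image hashes is hypothetical.

from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to an 8x8 grayscale grid and encode each pixel as
    one bit: 1 if brighter than the grid's mean brightness, 0 otherwise."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bit positions where two fingerprints differ."""
    return bin(a ^ b).count("1")


def matches_blocklist(path: str, blocklist: set[int], max_distance: int = 5) -> bool:
    """Flag the image if its fingerprint is within a small Hamming distance
    of any fingerprint on the blocklist. Tolerating small differences is the
    point of the design: it catches resized or re-encoded copies, but it is
    also why unrelated images can collide and be falsely flagged."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= max_distance for known in blocklist)
```

The sketch also hints at why the approach worries us: the matching code cannot tell what is on the blocklist. Whoever controls the fingerprint database controls what gets flagged.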
There's now so much evidence from credible, trustworthy people and organisations that Apple's system is bad and dangerous that I find it hard to believe there are still people cheering Apple on.