Apple details reasons to abandon CSAM-scanning tool, more controversy ensues

by WIRED
from Ars Technica - All content
(credit: Leonardo Munoz/Getty)

In December, Apple said that it was killing an effort to design a privacy-preserving iCloud photo-scanning tool for detecting child sexual abuse material (CSAM) on the platform. Originally announced in August 2021, the project had been controversial since its inception. Apple first paused it that September in response to concerns from digital rights groups and researchers that such a tool would inevitably be abused and exploited to compromise the privacy and security of all iCloud users. This week, a new child safety group known as Heat Initiative told Apple that it is organizing a campaign to demand that the company "detect, report, and remove" child sexual abuse material from iCloud and offer more tools for users to report CSAM to the company.


Today, in a rare move, Apple responded to Heat Initiative, outlining its reasons for abandoning the development of its iCloud CSAM-scanning feature and instead focusing on a set of on-device tools and resources for users known collectively as "Communication Safety" features. The company's response to Heat Initiative, which Apple shared with WIRED this morning, offers a rare look not just at its rationale for pivoting to Communication Safety, but at its broader views on creating mechanisms to circumvent user privacy protections, such as encryption, to monitor data. This stance is relevant to the encryption debate more broadly, especially as countries like the United Kingdom weigh passing laws that would require tech companies to be able to access user data to comply with law enforcement requests.

"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

