Apple Details Reasons to Abandon CSAM-Scanning Tool, More Controversy Ensues
Freeman writes:
In December, Apple said that it was killing an effort to design a privacy-preserving iCloud photo scanning tool for detecting child sexual abuse material (CSAM) on the platform. Originally announced in August 2021, the project had been controversial since its inception. Apple first paused it that September in response to concerns from digital rights groups and researchers that such a tool would inevitably be abused and exploited to compromise the privacy and security of all iCloud users. This week, a new child safety group known as Heat Initiative told Apple that it is organizing a campaign to demand that the company "detect, report, and remove" child sexual abuse material from iCloud and offer more tools for users to report CSAM to the company.
Today, in a rare move, Apple responded to Heat Initiative, outlining its reasons for abandoning the development of its iCloud CSAM scanning feature and instead focusing on a set of on-device tools and resources for users known collectively as "Communication Safety" features.
[...]
In 2021, the child safety organization Thorn lauded Apple's plan to develop an iCloud CSAM scanning feature. Heat Initiative leader Sarah Gardner said in an email to Apple CEO Tim Cook on Wednesday, which Apple also shared with WIRED, that Heat Initiative found Apple's decision to kill the feature "disappointing."
[...]
Apple maintains that, ultimately, even its own well-intentioned design could not be adequately safeguarded in practice, and that on-device nudity detection for features like Messages, FaceTime, AirDrop, and the Photo picker is a safer alternative. Apple has also begun offering an application programming interface (API) for its Communication Safety features so third-party developers can incorporate them into their apps. Apple says that the communication platform Discord is integrating the features and that app makers broadly have been enthusiastic about adopting them.
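For developers wondering what adopting this looks like, the sketch below is a rough illustration, assuming the SensitiveContentAnalysis framework that Apple ships as the developer-facing side of these on-device nudity checks (iOS 17 / macOS Sonoma and later). The image URL and the blur-before-display handling are hypothetical placeholders, not anything described in Apple's letter.

```swift
// Minimal sketch: checking a received image for nudity on device with
// Apple's SensitiveContentAnalysis framework. The file URL and the way a
// positive result is handled are illustrative assumptions.
import Foundation
import SensitiveContentAnalysis

func warnIfSensitive(imageAt url: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis only runs if the user (or a parent, via Screen Time) has
    // enabled Sensitive Content Warning / Communication Safety.
    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitive content analysis is turned off on this device.")
        return
    }

    do {
        // The framework evaluates the image entirely on device;
        // nothing is sent to Apple.
        let analysis = try await analyzer.analyzeImage(at: url)
        if analysis.isSensitive {
            // An app such as Discord would blur the image here and let
            // the user choose whether to reveal it.
            print("Image flagged as sensitive; blur before display.")
        } else {
            print("Image not flagged; display normally.")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```

Note that, as far as Apple's documentation goes, access to the framework is gated behind a sensitive-content-analysis entitlement granted to the app, so the check above would do nothing in an app that hasn't requested it.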
Read more of this story at SoylentNews.