Apple Under Pressure From Child Safety Group to Revive Its Anti-CSAM iCloud Scanning Tool
Months after Apple announced that it was planning to end its effort to design a cloud photo-scanning tool to detect child sexual abuse material (CSAM) on iCloud, child safety group Heat Initiative plans to launch a campaign to press Apple to revive it.
From its announcement in August 2021, Apple's anti-CSAM project was dogged by controversy. In December 2022, the tech giant ultimately decided to kill the project following widespread criticism over privacy concerns.
What's the Controversy About?
In 2021, Apple announced a slew of technological measures to help prevent child abuse across its platforms. These included changes to family iCloud accounts, iMessage, Search, and Siri.
However, the most technologically ambitious measure proved to be the most controversial: a system that would detect CSAM stored in iCloud accounts.
Implemented across iPhones, iPads, and Macs, the system would check every image uploaded to iCloud in the US against a database of known child abuse imagery.
Based on a cryptographic process that took place partly on Apple's servers and partly on the device, the feature was designed to detect matching photographs and report them to the National Center for Missing and Exploited Children (NCMEC) and, ultimately, to US law enforcement.
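To make the general idea concrete, here is a minimal, hypothetical sketch of matching uploads against a list of known-image fingerprints and only acting once a threshold of matches accumulates. It is not Apple's actual protocol: it uses an ordinary cryptographic digest as a stand-in for Apple's perceptual "NeuralHash," omits the private set intersection and server-side steps described above, and the names KNOWN_HASHES, REPORT_THRESHOLD, fingerprint, and scan_upload_queue are illustrative inventions, not Apple APIs.

```python
# Simplified, hypothetical sketch of matching image fingerprints against a
# provider-supplied list of known CSAM hashes, with a reporting threshold.
# This is NOT Apple's real system; it only illustrates the flag-on-known-match idea.

import hashlib
from pathlib import Path

# Placeholder for the provider-supplied list of known-image fingerprints.
# In a real deployment these would be perceptual hashes, not SHA-256 digests.
KNOWN_HASHES: set[str] = set()

# Only act once this many matches accumulate (illustrative value, not Apple's).
REPORT_THRESHOLD = 30


def fingerprint(image_path: Path) -> str:
    """Stand-in fingerprint: a cryptographic digest of the file bytes.
    A production system would use a perceptual hash that survives resizing
    and re-encoding, which SHA-256 does not."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def scan_upload_queue(image_paths: list[Path]) -> list[Path]:
    """Return images whose fingerprints appear in the known-hash list,
    but only once the number of matches crosses the reporting threshold."""
    matches = [p for p in image_paths if fingerprint(p) in KNOWN_HASHES]
    return matches if len(matches) >= REPORT_THRESHOLD else []
```

The threshold mirrors one privacy argument Apple made at the time: no single match is ever surfaced, only an account that crosses a preset number of known-image matches.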
The move drew heavy criticism from privacy and security researchers as well as digital rights groups, who worried that the feature could be abused to undermine the security and privacy of users around the world.
Apple, however, argued that none of the anti-CSAM features would endanger its users' privacy. The scanning mechanism, the company claimed, would not be able to access any non-CSAM images thanks to the cryptography involved.
In early September 2021, Apple announced that it would pause the rollout of the system to "collect input and make improvements before releasing these critically important child safety features." A little over a year later, however, the company decided to kill the CSAM detection tool for iCloud photos altogether.
A new child safety group named Heat Initiative recently urged Apple to revive the project, announcing that it would launch a campaign to press the company into detecting, removing, and reporting child sexual abuse material on iCloud. The group also wants Apple to offer more tools that users can use to report such content.
Apple's Response to Heat Initiative
The tech giant responded to Heat Initiative's demands, outlining why it abandoned the anti-CSAM project for iCloud.
"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," said Erik Neuenschwander, Apple's director of user privacy and child safety.
He went on to add that after discussing the issue with a host of privacy and security researchers, digital rights groups, and child safety advocates, Apple decided not to proceed with the development of a CSAM-scanning mechanism.
"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," the company said.
Neuenschwander also pointed out that it could potentially lead to a "slippery slope" of unintended consequences.
Apple diverted its anti-CSAM efforts and investments to its "Communication Safety" features, which it rolled out last December. These include a set of on-device tools and resources to help stop child exploitation and reduce CSAM creation.
However, it remains to be seen if Apple will stick to its resolve to refrain from data scanning in its efforts to combat child abuse.