
Apple Irritates Interest Groups, Law Enforcement With Its (Reasonable) Refusal To Restart Its Client-Side Scanning Program

by
Tim Cushing
from Techdirt

After years of irritating the DOJ with its refusal to compromise encryption, Apple suddenly went the other way after receiving criticism over its perceived inability to stop the distribution of CSAM (child sexual abuse material) via its devices and services.

For a very brief moment, Apple decided it would no longer be a world leader in privacy and security. It declared it would begin engaging in client-side scanning of users' content in hopes of preventing the spread of CSAM.

Mere moments later, it abandoned this never-implemented plan, citing the security and privacy flaws client-side scanning would create. While it's always a good idea to do what you can to prevent CSAM distribution, if that effort means subjecting every device user to unpatchable, deliberately created security holes, then it's not worth doing.

Why? Because it creates an exploit governments can use to search out other content they don't care for, like dissenting views or work product created by critical journalists, and to silence anything that doesn't comply with their preferred narrative.

Apple had good reasons for attempting to limit the distribution of CSAM. It also had good reason to shut down this project before it began after realizing the negative, unintended consequences would likely outweigh whatever public good it might create by deliberately compromising its own encryption.

Needless to say, this has produced another set of enemies for Apple. Governments all over the world sincerely hoped voluntary client-side scanning by a major US tech company would allow them to pass laws demanding similar compliance from other tech companies. Groups involved in deterring the sharing of CSAM hoped Apple's proactive scanning would prompt others to similarly compromise the security and privacy of their customers - something that might make it a bit easier to round up child abusers and deter future victimization.

Apple appears to have moved past the "mothball" stage to a permanent rejection of client-side scanning efforts. That move has generated a new round of criticism. This time it's not a government demanding Apple do more. It's child safety group Heat Initiative, which sent Apple an email criticizing its move away from proactive client-side scanning of uploaded content.

Heat Initiative wanted answers. It got them... but not the answers it wanted. Not only that, but Apple has chosen to make its answer public, as Lily Hay Newman reports for Wired:

Today, in a rare move, Apple responded to Heat Initiative, outlining its reasons for abandoning the development of its iCloud CSAM scanning feature and instead focusing on a set of on-device tools and resources for users known collectively as Communication Safety features. The company's response to Heat Initiative, which Apple shared with WIRED this morning, offers a rare look not just at its rationale for pivoting to Communication Safety, but at its broader views on creating mechanisms to circumvent user privacy protections, such as encryption, to monitor data. This stance is relevant to the encryption debate more broadly, especially as countries like the United Kingdom weigh passing laws that would require tech companies to be able to access user data to comply with law enforcement requests.

Heat Initiative may have preferred that this communication (especially since it ended with a powerful rebuttal of the group's demands) remain private. Apple has chosen to make its response [PDF] public because it makes points that groups and governments would rather weren't aired publicly, like the fact that client-side scanning tends to benefit autocrats, surveillance state participants, and, yes, other criminals far more than it benefits even the most helpless members of our society: the children being sexually exploited and abused.

Scanning of personal data in the cloud is regularly used by companies to monetize the information of their users. While some companies have justified those practices, we've chosen a very different path - one that prioritizes the security and privacy of our users. Scanning every user's privately stored iCloud content would in our estimation pose serious unintended consequences for our users. Threats to user data are undeniably growing - globally the total number of data breaches more than tripled between 2013 and 2021, exposing 1.1 billion personal records in 2021 alone. As threats become increasingly sophisticated, we are committed to providing our users with the best data security in the world, and we constantly identify and mitigate emerging threats to users' personal data, on device and in the cloud. Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit.

It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories. How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution? Tools of mass surveillance have widespread negative implications for freedom of speech and, by extension, democracy as a whole. Also, designing this technology for one government could require applications for other countries across new data types.

Scanning systems are also not foolproof and there is documented evidence from other platforms that innocent parties have been swept into dystopian dragnets that have made them victims when they have done nothing more than share perfectly normal and appropriate pictures of their babies.

These are all good answers. And they're definitely not the answers Heat Initiative was hoping to receive. But they're the most honest answers - ones that don't pretend what this group wants will somehow be workable and free of negative consequences.

The unsurprising twist is that Heat Initiative already knew Apple would raise legitimate concerns about client-side scanning, rather than simply do what the activist group wanted it to do. Instead of engaging with the issue honestly and directly, as Apple has done, Heat Initiative has already moved forward with a plan to (dishonestly) portray Apple as a willing participant in the spread of CSAM:

A child advocacy group, the Heat Initiative, has raised $2 million for a new national advertising campaign calling on Apple to detect, report and remove child sexual abuse materials from iCloud, its cloud storage platform.

Next week, the group will release digital advertisements on websites popular with policymakers in Washington, such as Politico. It will also put up posters across San Francisco and New York that say: "Child sexual abuse material is stored on iCloud. Apple allows it."

The thing is: Apple doesn't allow it. Apple simply refuses to undermine every user's privacy and security to detect what is assuredly a very small amount of illegal content being transmitted via its services. Apple's argument - stated directly and intelligently to Heat Initiative - is simply this: breaking encryption results in broken encryption. And that can be exploited by governments and criminals just as easily as it can be utilized to detect CSAM.

There is no perfect solution that benefits every stakeholder in CSAM cases. But what should never be considered the most acceptable solution is anything that converts innocent users into fodder for government oppression. That's what Apple wants to prevent. And, for that, it will continue to be labeled as a participant in child sexual abuse by intellectually dishonest entities like Heat Initiative.
