
Suing Apple To Force It To Scan iCloud For CSAM Is A Catastrophically Bad Idea

by Mike Masnick
from Techdirt

There's a new lawsuit in Northern California federal court that seeks to improve child safety online but could end up backfiring badly if it gets the remedy it seeks. While the plaintiff's attorneys surely mean well, they don't seem to understand that they're playing with fire.

The complaint in the putative class action asserts that Apple has chosen not to invest in preventive measures to keep its iCloud service from being used to store child sex abuse material (CSAM), while cynically rationalizing the choice as pro-privacy. This decision allegedly harmed the Jane Doe plaintiff, a child whom two unknown users contacted on Snapchat to ask for her iCloud ID. They then sent her CSAM over iMessage and got her to create and send them back CSAM of herself. Those iMessage exchanges went undetected, the lawsuit says, because Apple elected not to employ available CSAM detection tools, thus knowingly letting iCloud become "a safe haven for CSAM offenders." The complaint asserts claims for violations of federal sex trafficking law, two states' consumer protection laws, and various torts including negligence and products liability.

Here are key passages from the complaint:

[Apple] opts not to adopt industry standards for CSAM detection... [T]his lawsuit ... demands that Apple invest in and deploy means to comprehensively ... guarantee the safety of children users. ... [D]espite knowing that CSAM is proliferating on iCloud, Apple has "chosen not to know" that this is happening ... [Apple] does not ... scan for CSAM in iCloud. ... Even when CSAM solutions ... like PhotoDNA[] exist, Apple has chosen not to adopt them. ... Apple does not proactively scan its products or services, including storages [sic] or communications, to assist law enforcement to stop child exploitation. ...

According to [its] privacy policy, Apple had stated to users that it would screen and scan content to root out child sexual exploitation material. ... Apple announced a CSAM scanning tool, dubbed NeuralHash, that would scan images stored on users' iCloud accounts for CSAM ... [but soon] Apple abandoned its CSAM scanning project ... it chose to abandon the development of the iCloud CSAM scanning feature ... Apple's Choice Not to Employ CSAM Detection ... Is a Business Choice that Apple Made. ... Apple ... can easily scan for illegal content like CSAM, but Apple chooses not to do so. ... Upon information and belief, Apple ... allows itself permission to screen or scan content for CSAM content, but has failed to take action to detect and report CSAM on iCloud. ...

[Questions presented by this case] include: ... whether Defendant has performed its duty to detect and report CSAM to NCMEC [the National Center for Missing and Exploited Children]. ... Apple ... knew or should have known that it did not have safeguards in place to protect children and minors from CSAM. ... Due to Apple's business and design choices with respect to iCloud, the service has become a go-to destination for ... CSAM, resulting in harm for many minors and children [for which Apple should be held strictly liable] ... Apple is also liable ... for selling defectively designed services. ... Apple owed a duty of care ... to not violate laws prohibiting the distribution of CSAM and to exercise reasonable care to prevent foreseeable and known harms from CSAM distribution. Apple breached this duty by providing defective[ly] designed services ... that render minimal protection from the known harms of CSAM distribution. ...

Plaintiff [and the putative class] ... pray for judgment against the Defendant as follows: ... For [an order] granting declaratory and injunctive relief to Plaintiff as permitted by law or equity, including: Enjoining Defendant from continuing the unlawful practices as set forth herein, until Apple consents under this court's order to ... [a]dopt measures to protect children against the storage and distribution of CSAM on the iCloud ... [and] [c]omply with quarterly third-party monitoring to ensure that the iCloud product has reasonably safe and easily accessible mechanisms to combat CSAM ...

What this boils down to: Apple could scan iCloud for CSAM, and has said in the past that it would and that it does, but in reality it chooses not to. The failure to scan is a wrongful act for which Apple should be held liable. Apple has a legal duty to scan iCloud for CSAM, and the court should make Apple start doing so.

This theory is perilously wrong.

The Doe plaintiff's story is heartbreaking, and it's true that Apple has long drawn criticism for its approach to balancing multiple values such as privacy, security, child safety, and usability. It is understandable to assume that the answer is for the government, in the form of a court order, to force Apple to strike that balance differently. After all, that is how American society frequently remedies alleged shortcomings in corporate practices.

But this isn't a case about antitrust, or faulty smartphone audio, or virtual casino apps (as in other recent Apple class actions). Demanding that a court force Apple to change its practices is uniquely infeasible, indeed dangerous, when it comes to detecting illegal material its users store on its services. That's because this demand presents constitutional issues that other consumer protection matters don't. Thanks to the Fourth Amendment, the courts cannot force Apple to start scanning iCloud for CSAM; even pressuring it to do so is risky. Compelling the scans would, perversely, make it way harder to convict whoever the scans caught. That's what makes this lawsuit a catastrophically bad idea.

(The unconstitutional remedy it requests isn't all that's wrong with this complaint, mind. Let's not get into the Section 230 issues it waves away in two conclusory sentences. Or how it mistakes language in Apple's privacy policy that it "may" use users' personal information for purposes including CSAM scanning, for an enforceable promise that Apple would do that. Or its disingenuous claim that this isn't an attack on end-to-end encryption. Or the factually incorrect allegation that Apple "does not proactively scan its products or services" for CSAM at all, when in fact it does for some products. Let's set all of that aside. For now.)

The Fourth Amendment to the U.S. Constitution protects Americans from unreasonable searches and seizures of our stuff, including our digital devices and files. "Reasonable" generally means there's a warrant for the search. If a search is unreasonable, the usual remedy is what's called the exclusionary rule: any evidence turned up through the unconstitutional search can't be used in court against the person whose rights were violated.

While the Fourth Amendment applies only to the government and not to private actors, the government can't use a private actor to carry out a search it couldn't constitutionally do itself. If the government compels or pressures a private actor to search, or the private actor searches primarily to serve the government's interests rather than its own, then the private actor counts as a government agent for purposes of the search, which must then abide by the Fourth Amendment; otherwise, the remedy is exclusion.

If the government - legislative, executive, or judiciary - forces a cloud storage provider to scan users' files for CSAM, that makes the provider a government agent, meaning the scans require a warrant, which a cloud services company has no power to get, making those scans unconstitutional searches. Any CSAM they find (plus any other downstream evidence stemming from the initial unlawful scan) will probably get excluded, but it's hard to convict people for CSAM without using the CSAM as evidence, making acquittals likelier. Which defeats the purpose of compelling the scans in the first place.

Congress knows this. That's why the federal statute requiring providers to report CSAM to NCMEC when they find it on their services, 18 U.S.C. § 2258A, contains an express disclaimer that the law does not mean they must affirmatively search for CSAM. Providers of online services may choose to look for CSAM, and if they find it, they have to report it - but they cannot be forced to look.

Now do you see the problem with the Jane Doe lawsuit against Apple?

This isn't a novel issue. Techdirt has covered it before. It's all laid out in a terrific 2021 paper by Jeff Kosseff. I have also discussed this exact topic over and over and over and over and over and over again. As my latest publication (based on interviews with dozens of people) describes, all the stakeholders involved in combating online CSAM - tech companies, law enforcement, prosecutors, NCMEC, etc. - are excruciatingly aware of the "government agent" dilemma, and they all take great care to stay very far away from potentially crossing that constitutional line. Everyone scrupulously preserves the voluntary, independent nature of online platforms' decisions about whether and how to search for CSAM.

And now here comes this lawsuit like the proverbial bull in a china shop, inviting a federal court to destroy that carefully maintained and exceedingly fragile dynamic. The complaint sneers at Apple's "business choice" as a wrongful act to be judicially reversed rather than something absolutely crucial to respect.

Fourth Amendment government agency doctrine is well-established, and there are numerous cases applying it in the context of platforms' CSAM detection practices. Yet Jane Doe's counsel don't appear to know the law. For one, their complaint claims that Apple "does not proactively scan its products or services ... to assist law enforcement to stop child exploitation." Scanning to serve law enforcement's interests would make Apple a government agent. Similarly, the complaint claims Apple "has failed to take action to detect and report CSAM on iCloud," and asks "whether Defendant has performed its duty to detect and report CSAM to NCMEC." This conflates two critically distinct actions. Apple does not and cannot have any duty to detect CSAM, as the statute imposing a duty to report CSAM expressly states. It's like these lawyers didn't even read the entire statute, much less any of the Fourth Amendment jurisprudence that squarely applies to their case.

Any competent plaintiff's counsel should have figured this out before filing a lawsuit asking a federal court to make Apple start scanning iCloud for CSAM, thereby making Apple a government agent, thereby turning the compelled iCloud scans into unconstitutional searches, thereby making it likelier for any iCloud user who gets caught to walk free, thereby shooting themselves in the foot, doing a disservice to their client, making the situation worse than the status quo, and causing a major setback in the fight for child safety online.

The reason nobody's filed a lawsuit like this against Apple to date, despite years of complaints from left, right, and center about Apple's ostensibly lackadaisical approach to CSAM detection in iCloud, isn't because nobody's thought of it before. It's because they thought of it and they did their fucking legal research first. And then they backed away slowly from the computer, grateful to have narrowly avoided turning themselves into useful idiots for pedophiles. But now these lawyers have apparently decided to volunteer as tribute. If their gambit backfires, they'll be the ones responsible for the consequences.

Riana Pfefferkorn is a policy fellow at Stanford HAI who has written extensively about the Fourth Amendment's application to online child safety efforts.
