
Breaking Encryption To Aid Client-Side Scanning Isn’t The Solution To The CSAM Problem

by Tim Cushing

Plenty of legislators and law enforcement officials seem to believe there's only one acceptable solution to the CSAM (child sexual abuse material) problem: breaking encryption.

They may state some support for encryption, but when it comes to this particular problem, many of these officials seem to believe everyone's security should be compromised just so a small percentage of internet users can be more easily observed and identified. They tend to talk around the encryption issue, focusing on client-side scanning of user content - a rhetorical tactic that willfully ignores the fact that client-side scanning would necessitate the elimination of one end of end-to-end encryption to make this scanning possible.
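
To see why, it helps to look at where a client-side scanner would have to sit. The sketch below is a deliberately toy illustration - every name, the XOR "cipher," and the hash list are hypothetical stand-ins, not any real messenger's design - but it captures the structural point: the scan runs on the plaintext before encryption ever happens, turning the user's own device into the inspection point.

```python
# A minimal, purely illustrative sketch of where client-side scanning
# sits relative to end-to-end encryption. Every name here is a
# hypothetical stand-in: the XOR "cipher" only marks the encryption
# boundary, and SHA-256 stands in for a perceptual-hash database.
import hashlib

FLAGGED_HASHES = {hashlib.sha256(b"known flagged file").hexdigest()}

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Placeholder cipher (XOR); the only point is *when* encryption happens.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(plaintext: bytes, key: bytes) -> bytes:
    # The client-side scan runs on the device, on plaintext, before encryption.
    if hashlib.sha256(plaintext).hexdigest() in FLAGGED_HASHES:
        print("match - would be reported before the message is ever encrypted")
    return toy_encrypt(plaintext, key)

ciphertext = send_message(b"known flagged file", key=b"not-a-real-key")
```

Swap in the strongest cipher imaginable for toy_encrypt and nothing changes: the scanner has already seen the message, which is the sense in which one "end" of end-to-end encryption gets eliminated.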

The issue at the center of these debates often short-circuits the debate itself. Since children are the victims, many people reason no sacrifice (even if it's a government imposition) is too great. Those who argue against encryption-breaking mandates are treated as though they'd rather aid and abet child exploitation than allow governments to do whatever they want in response to the problem.

Plenty of heat has been directed Meta's way in recent years, due to its planned implementation of end-to-end encryption for Facebook Messenger users. And that's where the misrepresentation of the issue begins. Legislators and law enforcement officials claim the millions of CSAM reports from Facebook will dwindle to almost nothing if Messenger is encrypted, preventing Meta from seeing users' communications.

This excellent post by cybersecurity expert Susan Landau for Lawfare punctures holes in these assertions, pointing out that the "millions" of reports Facebook generates annually are hardly indicative of widespread sexual abuse of children.

Yes, the transition of CSAM sharing to online communication services has resulted in a massive increase in reports to NCMEC (National Center for Missing and Exploited Children).

The organization received 29 million reports of online sexual exploitation in 2021, a 10-fold increase over a decade earlier. Meanwhile the number of video files reported to NCMEC increased over 40 percent between 2020 and 2021.

But that doesn't necessarily mean there are more children being exploited than ever before. Nor does it mean Facebook sees more CSAM than other online services, despite its massive user base.

Understanding the meaning of the NCMEC numbers requires careful examination. Facebook found that "over 90 percent of the reports the company filed with NCMEC in October and November 2021 were the same as or visually similar to previously reported content." Half of the reports were based on just six videos.

As Landau is careful to point out, that doesn't mean the situation is acceptable. It just means tossing around phrases like "29 million reports" doesn't necessarily mean millions of children are being exploited or millions of users are sharing CSAM via these services.

Then there's the uncomfortable fact that a sizable percentage of the content reported to NCMEC doesn't actually involve any exploitation of minors by adults. Landau quotes from Laura Draper's 2022 report on CSAM and the rise of encrypted services. In that report, Draper points out that some of the reported content is generated by minors for other minors: i.e., sexting.

Draper observed that CSAE consists of four types of activities exacerbated by internet access: (a) CSAM, which is the sharing of photos or videos of child sexual abuse imagery; (b) perceived first-person (PFP) material, which is nude imagery taken by children of themselves and then shared, often much more widely than the child intended; (c) internet-enabled child sex trafficking; and (d) live online sexual abuse of children.

While these images are considered "child porn" (to use an antiquated term), they are not actually images taken by sexual abusers, which means they aren't actually CSAM, even if they're treated as such by NCMEC and reported as such by communication services. In these cases, Landau suggests more education of minors to inform them of the unintended consequences of these actions, first and foremost that they lose control over who sees these images once they've shared them with anyone else.

The rest of the actions on that list are indeed extremely disturbing. But, as Landau (and Draper) suggest, there are better solutions already available that don't involve undermining user security by removing encryption or undermining their privacy by subjecting them to client-side scanning.

[C]onsider the particularly horrific crime in which there is live streaming of a child being sexually abused according to requests made by a customer. The actual act of abuse often occurs abroad. In such cases, aspects of the case can be investigated even in the presence of E2EE. First, the video stream is high bandwidth from the abuser to the customer but very low bandwidth the other way, with only an occasional verbal or written request. Such traffic stands out from normal communications; it looks neither like a usual video communication nor a showing of a film. And the fact that the trafficker must publicly advertise for customers provides law enforcement another route for investigation.
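
Landau's point about traffic patterns is a metadata observation, which is exactly why it survives end-to-end encryption: packet volumes and directions remain visible even when content does not. The sketch below is an invented illustration - the flow fields, thresholds, and example numbers are all assumptions, not any real detection system - of how lopsided such a stream looks next to an ordinary, roughly symmetric video call.

```python
# An illustrative traffic-asymmetry heuristic over hypothetical flow
# records. Thresholds and field names are invented for the example.
from dataclasses import dataclass

@dataclass
class Flow:
    bytes_a_to_b: int   # e.g., the abuser's video stream toward the customer
    bytes_b_to_a: int   # e.g., the customer's occasional requests back
    duration_s: int

def looks_one_sided(flow: Flow, ratio_threshold: float = 200.0,
                    min_rate_bps: int = 500_000) -> bool:
    """Flag long flows where one direction carries video-scale traffic
    while the other carries almost nothing."""
    if flow.duration_s < 300:
        return False                    # ignore short bursts
    down_rate = flow.bytes_a_to_b * 8 / flow.duration_s
    if down_rate < min_rate_bps:
        return False                    # not video-scale traffic
    ratio = flow.bytes_a_to_b / max(flow.bytes_b_to_a, 1)
    return ratio > ratio_threshold      # real video calls are far more symmetric

# A 20-minute, ~6 Mbps stream answered by a few kilobytes of requests.
print(looks_one_sided(Flow(bytes_a_to_b=900_000_000,
                           bytes_b_to_a=40_000,
                           duration_s=1_200)))   # True
```

An ordinary two-way video call pushes comparable traffic in both directions, so it fails the ratio test; a long, one-sided stream punctuated by tiny requests is the outlier Landau describes.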

Unfortunately, government officials tend to portray E2EE as the root of the CSAM problem, rather than just something that exists alongside a preexisting problem. Without a doubt, encryption can pose problems for investigators. But there are a plethora of options available that don't necessitate making everyone less safe and secure just because abusers use encrypted services in order to avoid immediate detection.

Current processes need work as well. As invaluable as NCMEC is, it's also contributing to a completely different problem. Hash matching is helpful, but it's not infallible: hash collisions (where two different images generate identical hashes) are possible, and malicious actors could deliberately craft collisions to implicate innocent people or to hide their own sharing of illicit material. False positives do happen. Unfortunately, at least one law enforcement agency is treating the people on the receiving end of erroneous flagging as criminal suspects.

Responding to an information request from ICCL, the Irish police reported that NCMEC had provided 4,192 referrals in 2020. Of these, 409 of the cases were actionable and 265 cases were completed. Another 471 referrals were "Not Child Abuse Material." The Irish police nonetheless stored "(1) suspect email address, (2) suspect screen name, [and] (3) suspect IP address." Now 471 people have police records because a computer program incorrectly flagged them as having CSAM.
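
Note the arithmetic buried in that response: 471 of 4,192 referrals - roughly one in nine - were affirmatively not abuse material, while only 409 were even actionable. To see how such false positives arise, here is a minimal sketch of perceptual-hash matching; the tiny "average hash" and the invented database are stand-ins for proprietary systems like PhotoDNA, used only to show why two different images can produce the same value.

```python
# A toy perceptual hash over an imaginary 64-pixel grayscale image:
# one bit per pixel, set if the pixel is brighter than the mean.
def average_hash(pixels: list[int]) -> int:
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

# Hypothetical database of flagged hashes, built from one image.
image_a = [10] * 32 + [200] * 32      # dark half, bright half
flagged = {average_hash(image_a)}

# A different image - different brightness values, but the same
# above/below-the-mean pattern - produces an identical hash.
image_b = [90] * 32 + [140] * 32
print(average_hash(image_b) in flagged)   # True: a false positive
```

Because perceptual hashes are built to collapse similar-looking inputs, collisions aren't just bad luck; researchers have demonstrated deliberately crafted collisions against Apple's NeuralHash, which is precisely the malicious scenario described above.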

Stripping encryption and forcing service providers to engage in client-side scanning will only increase the number of false positives. But much of what's being proposed - both overseas and here in the United States - takes the short-sighted view that encryption must go if children are to be saved. To come up with better solutions, legislators and law enforcement need to be able to see past the barriers that immediately present themselves. Rather than focus on short-term hurdles, they need to recognize that online communication methods will always be in a state of flux. What appears to be the right thing to do now may become utterly worthless in the near future.

Think differently. Think long term. Think about protecting the privacy and security of all members of society - children and adults alike. By failing to consider the big picture, the U.K. Online Safety Act has taken a dangerous, short-term approach to a complex societal problem. The EU and U.S. have the chance to avoid the U.K.'s folly; they should do so. The EU proposal and the U.S. bills are not sensible ways to approach the public policy concerns of online abetting of CSAE. Nor are these reasonable approaches in view of the cyber threats our society faces. The bills should be abandoned, and we should pursue other ways of protecting both children and adults.

The right solution now isn't to make everyone less safe and secure. Free world governments shouldn't be in such a hurry to introduce mandates that lend themselves to abuse by government entities and that can be used to justify even more abusive surveillance methods deployed by autocrats and serial human rights abusers. Yes, the problem is important and should be of utmost concern. But that doesn't mean governments should, for all intents and purposes, outlaw encryption just because it seems to be the quickest, easiest solution to a problem that's often misrepresented and misperceived.
