Europol Tells EU Commission: Hey, When It Comes To CSAM, Just Let Us Do Whatever We Want
When it comes to the children, Wu-Tang Europol is for the children.
According to a recent report by Balkan Insight, Europol (Interpol, minus the across-the-pond component) believes the only thing that should matter is the kids. Europeans' rights and privacy should be sidelined - possibly forever! - until European law enforcement can put a dent in CSAM sharing.
The European police agency, Europol, has requested unfiltered access to data that would be harvested under a controversial EU proposal to scan online content for child sexual abuse images and for the AI technology behind it to be applied to other crimes too, according to minutes of a high-level meeting in mid-2022.
The meeting, involving Europol Executive Director Catherine de Bolle and the European Commission's Director-General for Migration and Home Affairs, Monique Pariat, took place in July last year, weeks after the Commission unveiled a proposed regulation that would require digital chat providers to scan client content for child sexual abuse material, or CSAM.
[...]
In the meeting, the minutes of which were obtained under a Freedom of Information request, Europol requested unlimited access to the data produced from the detection and scanning of communications, and that no boundaries be set on how this data is used.
To be fair (and there's really no reason to be), the pitch appeared back when the EU government seemed a lot more certain it would be able to impose client-side scanning on service providers. This imposition would basically make encryption irrelevant if not actually criminal. To engage in scanning of users' data and content, end-to-end encrypted services would need to remove at least one end of the encryption. Once that's removed, anyone (not just the Europol cops) could conceivably eavesdrop on online conversations.
Most of those plans have been discarded, thanks to strong opposition from activist groups and, surprisingly, a large number of EU member nations. On top of that, EU government entities tasked with vetting the proposal also found it would violate rights on the regular.
But it's not just about the kids, although that was Europol's original leverage point. The "for the children" argument tends to short-circuit opposition, forcing those opposed to outrageous government mandates to "side" with child molesters. It shouldn't work as well as it does, since it's been a cliche for decades. But it still works. And it works often enough that Europol not only demanded access to combat CSAM, but demanded the same access be used to search for criminal activity wholly unrelated to the sexual exploitation of children.
"No boundaries" is pretty close to a direct quote from the Europol demand. The law enforcement agency basically asked that it be allowed to access everything for the purposes of possibly detecting other criminal somethings.
"All data is useful and should be passed on to law enforcement, there should be no filtering by the [EU] Centre because even an innocent image might contain information that could at some point be useful to law enforcement," the minutes state.
Sure. Why not. Let's not fuck around. Europol wants a police state supported by always-on surveillance of any and all content uploaded by internet service users. Stasi-on-digital-steroids. Considering there are any number of EU members that harbor ill will towards certain residents of their countries, granting an international coalition of cops unfiltered access to content would swiftly move past the initial CSAM justification, with governments seeking out any content they don't like and punishing those who dared to offend their elected betters.
And because cops will always help out cops, even Europol's supposed voice of reason decided it was better to scan everything than address (apparently comparatively minor) concerns about violating several European privacy laws.
According to an internal Europol document, the agency's own Fundamental Rights Officer raised concerns in June 2023 about "possible fundamental rights issues" stemming from "biased results, false positives or false negatives", but gave the project the green light anyway.
Kind of takes the "fundamental" right out of Fundamental Rights. The officer at least recognized the "fundamental" rights. But they ultimately decided Europol should have what it wants, even if it's at the expense of the rights of the policed.
Meanwhile, the revolving door spins. A Europol official has exited the public sector to work for Thorn - the AI/facial recognition tech firm last seen as a major FBI supplier in a GAO report that noted the FBI hasn't bothered to do much training before cutting agents loose to do whatever they want with powerful tools that have a long history of false positives and built-in bias.
According to information available online, Cathal Delaney, a former Europol official who led the agency's Child Sexual Abuse team at its Cybercrime Centre, and who worked on a CSAM AI pilot project, has begun work at the US-based organisation Thorn, which develops AI software to target CSAM.
Delaney moved to Thorn immediately after leaving Europol in January 2022 and is listed in the lobby register of the German federal parliament as an employee who "represents interests directly".
So, those lobbying directly as public officials for expanded surveillance powers are completely comfortable with moving on to private sector companies that would directly and immediately benefit from expanded surveillance powers. The US is definitely going to have to step up its game if it wants to continue to be the leader of the free world's most revolving doors.
For the moment, it appears Europol's wishes have been denied. But only because the EU government is currently unwilling to completely abdicate its responsibility to the millions of EU residents it serves. Europol is presumably biding its time, waiting for the day when the government finally has the support - or the gall - to tell EU citizens "fuck your rights." But until that happens, it appears Europol will continue to advocate for widespread surveillance that goes far beyond the "for the children" pitch it used to open this conversation.