The Trump FTC’s War On Porn Just Ensured That Accused CSAM Offenders Will Walk Free
Well, they finally did it. A federal agency has shattered the precarious base that upholds the edifice of prosecutions for child sex abuse material (CSAM) in America. That agency is the Federal Trade Commission (FTC), which just entered into a deeply problematic settlement with a major online content platform for "doing little to block" CSAM and got a federal judge to approve it. On its face, the order may sound like a win. But, in fact, it will help accused offenders walk free, perversely undermining its own stated purpose.
I'm going to discuss the settlement order in a two-part series. In this post, I'll describe what the order requires the platform's operator, Aylo, to do and explain why that's going to create a huge headache for prosecutors in CSAM cases. In the next post, I'll discuss why the settlement isn't really about fighting CSAM anyway; it's a stalking horse for the Trump FTC's ulterior agenda, which comes straight out of Project 2025.
The FTC is the nation's consumer protection watchdog. It lacks criminal enforcement authority; that's the Department of Justice's job. As such, the FTC is not in the business of investigating reports of CSAM from online platforms and prosecuting suspected CSAM purveyors. Yet the Commission decided to push the envelope of its authority by wading into those unfamiliar waters. By refusing to stay in its lane, the FTC just made it harder for the people who actually do prosecute crimes to bring CSAM offenders to justice. Meanwhile, CSAM's illegality and obloquy will let the Commission disguise a power grab that's really about controlling legal speech online.
As I know from my own research, everyone who is familiar with the ecosystem of reporting, investigating, and prosecuting online CSAM steers well clear of disturbing the fragile edifice underpinning CSAM prosecutions in the U.S. - to wit, that platforms' common practice of scanning their services for CSAM is completely voluntary, not the result of government compulsion. That voluntariness is sacrosanct, as I last explained in Techdirt scarcely a year ago, because of the Fourth Amendment. Typically, its prohibition against unreasonable searches and seizures applies only to the government, not to private actors. But if platforms search users' uploaded files for CSAM at the government's behest, not of their own volition, they stop acting as private entities and become agents of the government. That turns all those scans into unconstitutional warrantless searches, so anything they turn up isn't admissible as evidence in court, which makes it harder to convict anyone caught by the scans. The government's compulsion ends up being self-defeating.
That's why it's crucial to avoid government interference with online platforms' decisionmaking about whether and how to search their services for illegal content. But the FTC just fucked it all up.
What Happened?
On September 8, a federal judge in Utah approved a proposed stipulated order against Aylo that had been filed a few days earlier by the FTC and the Utah Consumer Protection Division (CPD). You may not know Aylo's name, but you know its product: it operates Pornhub, the world's most popular porn website, along with numerous other NSFW properties. To announce the settlement, the FTC issued a press release that linked to the proposed order and the complaint against Aylo. (The full court docket is here, thanks to the folks at the Free Law Project.)
The FTC and Utah CPD allege that Aylo was committing unfair and deceptive trade practices by hosting thousands of pieces of CSAM and non-consensual pornography (revenge porn, rape videos, etc.), despite saying it prohibited those kinds of content. (The complaint calls the latter "NCM," but I'll call it by the more-common acronym NCII, for non-consensual intimate imagery.) The investigation stemmed from a December 2020 New York Times opinion column wherein columnist Nick Kristof asserted that CSAM and NCII were rampant problems on Pornhub and other adult sites under the same corporate umbrella (then MindGeek, now Aylo since 2023).
To settle the allegations, Aylo agreed to make a ton of changes, including significant reforms aimed at removing and preventing CSAM and NCII on its sites. It also agreed to pay the Utah CPD $5 million now, and another $10 million if it fails to comply with the order, which the Utah court retains jurisdiction to enforce.
Reducing the availability of CSAM and NCII on major porn sites is a worthwhile aim. But by mandating that Aylo monitor all files uploaded to its various services, the settlement will backfire on its own ostensible purpose, making it harder to convict anybody caught trying to upload CSAM or NCII to Aylo's sites.
What Does the Order Require?
The order is over 60 pages long, and it requires a lot of things that fall outside the scope of this discussion (but some of which I'll cover in part two). The part that's a problem for CSAM prosecutions comes in Section III of the order, which requires Aylo to "establish, implement, and thereafter maintain" a "Mandated Program to Prevent the Posting and Proliferation of CSAM and NCM." (That starts at page 11 of the order.) One of the requirements of this program (at p. 17) is that Aylo must start "[u]tilizing available tools and technologies to review Content [defined as any depiction of 'sexually explicit conduct'] to determine whether it is actual or suspected CSAM or NCM prior to its publication or otherwise making it available to a consumer on [Aylo's sites], including, but not limited to ... [c]omparing Content, via internal or external tools, to Content previously identified and/or fingerprinted or otherwise marked (whether by any Defendant or another entity) as actual or suspected CSAM or NCM."
Other elements of the mandated program include requiring human moderators who review content to watch/listen to each file in its entirety, or alternatively read an entire transcript of the content (p. 18), and requiring Aylo to implement "[p]olicies, practices, procedures, and technical measures designed to ensure the consistent and thorough review of Content to determine whether it is actual or suspected CSAM or NCM, ... before that Content is published on any Covered Service" (p. 19).
Put simply, these provisions constitute a mandate to scan all uploaded files for CSAM or NCII. That's what "content review" means here.[1] As required by the FTC and the Utah CPD, agreed to by Aylo, and endorsed by a federal court, Aylo must search all uploaded files to check if they're a match to known CSAM or NCII, whether the files are uploaded by a user, a content partner, or a model who contributes content to the site. Noncompliance with the order will cost Aylo a $10 million penalty that's currently suspended (see p. 56).
Scanning for CSAM is a standard practice that is already widespread among adult sites and user-generated content (UGC) sites generally. The best-known example is Microsoft's PhotoDNA software, which finds matches to known CSAM. Companies like Google and Meta also have tools for detecting new, previously unseen instances of CSAM. (Some platforms also search for known NCII, a practice likely to expand under the TAKE IT DOWN Act.) Crucially, however, that widespread standard practice is voluntary. The reason it has never before, to my knowledge, been compelled by any U.S. authority is the Fourth Amendment.
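To make the mechanics concrete, here's a minimal sketch of the hash-matching pattern these scanning tools follow. To be clear, this is not PhotoDNA, whose perceptual hashing is proprietary and tolerates resizing and re-encoding; the plain SHA-256 comparison below catches only byte-identical copies, and every name and value in it is a hypothetical placeholder.

```python
import hashlib

# Hypothetical set of hashes of previously identified files, standing in
# for the "fingerprinted or otherwise marked" content the order describes.
# Real match lists come from internal databases or external clearinghouse
# feeds and contain millions of entries.
KNOWN_FLAGGED_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb924...",  # truncated placeholder entry
}

def matches_known_content(file_bytes: bytes) -> bool:
    """Return True if the upload is a byte-for-byte match to a flagged file."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_FLAGGED_HASHES

def screen_upload(file_bytes: bytes) -> str:
    # Under the order's logic, matching content must be caught *before*
    # publication, so screening happens at upload time.
    if matches_known_content(file_bytes):
        return "blocked: matches previously identified content"
    return "held for further review before publication"
```

The point of the sketch is that the scan necessarily reads the contents of every uploaded file, which is exactly what makes the constitutional problem described next unavoidable once the scanning is compelled rather than voluntary.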
The government cannot force a private actor to carry out a search the government could not constitutionally conduct itself; if it could, the Fourth Amendment would be a dead letter. When the government coerces a private actor to carry out a search, the private entity becomes an agent of the government, and its searches must comport with the Fourth Amendment. Generally, a search requires a warrant to be reasonable. Of course, privately owned online platforms can't get warrants. So when they search users' files at the government's behest, all those scans become mass warrantless searches of the contents of users' files, in violation of the Constitution.
All of this was laid out in a landmark 2016 ruling that then-Judge Neil Gorsuch wrote for the Tenth Circuit - the appeals court for the very same federal district court in Utah that just rubber-stamped the Aylo order. That would've been great ammo for Aylo if it had fought back instead of settling.
Why Care About the Privacy Rights of Alleged Pedophiles?
Who cares about the rights of accused CSAM offenders? Beyond the fact that even the worst among us have constitutional rights, this matters because violating the accused's Fourth Amendment rights makes it harder to convict them for the terrible things they're accused of.
The remedy for an unconstitutional search is suppression: the evidence obtained via the illegal search gets excluded, along with any derivative evidence turned up as a result of it. That is, if the government compels a private platform to scan user files, and the scan turns up CSAM, the CSAM cannot then be used against the user in a prosecution for that CSAM, which, needless to say, makes securing a conviction more difficult. This is why the federal statute requiring platforms to report CSAM when they find it on their services, 18 U.S.C. § 2258A (reports go to a clearinghouse called the National Center for Missing and Exploited Children, or NCMEC), is very, very explicit in saying the statute does not require any service to monitor the contents of user files for CSAM.
Mandating that private platforms scan for CSAM is a completely self-defeating policy. That is why CSAM scans must be voluntary. The entire ecosystem of online platforms scanning for CSAM - which results in tens of millions of reports to NCMEC per year - depends entirely on the voluntariness of those searches. If they're not voluntary, that whole system comes crashing down.
This "Fourth Amendment agency dilemma" (to borrow the title of a great 2021 paper) is very well understood among the people who, unlike the Federal Trade Commission, actually work on fighting online CSAM as their daily jobs. As I wrote in a research publication last year, every actor in that ecosystem - platforms, law enforcement, the federal government, NCMEC - is excruciatingly aware of the Fourth Amendment government agent doctrine and takes great care to respect the voluntary nature of platforms' scanning choices. Now the FTC and the state of Utah have waltzed in, slapped all those people in the face, and gotten a federal court to order Aylo to do the very thing that everyone who knew what they were doing had scrupulously avoided for years.
Thanks to the FTC, all the scans that Aylo conducts under this stipulated order will be mass warrantless searches of the contents of other people's files. If anyone gets arrested and prosecuted due to CSAM or NCII that's turned up in those scans, they are, ironically, likelier to walk free precisely because of the FTC's order turning Aylo into an agent of the government.
This will also affect cases that don't involve Aylo. As I'll explain more in part two, the FTC regulates via consent decrees. Inserting a scanning mandate into the Aylo order, based on allegations that Aylo did little to block CSAM or NCII uploads, signals to other UGC-driven sites (not just adult sites) that the FTC expects CSAM/NCII scanning as a baseline. After the FTC put Aylo's head on a pike to serve as a warning to others, future criminal defendants ensnared by scanning can argue that even if their platform's scans used to be voluntary, they aren't anymore. The argument will be stronger for any platforms that only began scanning after the Aylo order. Even if those motions to suppress ultimately fail, they'll still be a headache for prosecutors, and if any of them do succeed, those defendants will have the FTC to thank.
How Did This Happen, and What Comes Next?
How did this ticking constitutional time bomb make it into the final order? The simplest explanation is that these are consumer protection attorneys who didn't have the necessary Fourth Amendment knowledge to spot the government agent problem. As any lawyer knows, having deep expertise in one area of the law doesn't mean you can necessarily issue-spot all the other things lurking in a matter you're handling. But that's why you either loop in teammates who have different skills or stay in your damn lane.
Frustratingly, I think the FTC did talk to criminal prosecutors. The Utah court approved the Aylo settlement on the very same day a criminal defendant was sentenced in a different federal court on sex trafficking charges for his role in GirlsDoPorn, a one-time content partner of Aylo's to which a whole page of the Aylo complaint is dedicated. That's quite the coincidence, so maybe the FTC lawyers coordinated the timing with the GirlsDoPorn prosecutors. Still, why expect them to find and fix a problem buried partway into a 60-page order in someone else's case?
In any event, here we are. Now that the order has been approved by the court, it's not clear what can be done to fix it. Aylo, the FTC, and the Utah CPD have "waive[d] all rights to appeal or otherwise challenge or contest the validity of this Order" (p. 4). Maybe if there's a congressional hearing where Aylo, the FTC, and the Utah CPD are asked to explain what "content review" means, the court might sua sponte reconsider the order. I'm not holding my breath. In the meantime, Christmas has come early for the criminal defense bar. Heckuva job, FTC.
[1] Or maybe the mandate to "review content" doesn't really require Aylo to search all uploads. But if it doesn't mean that, what does it mean? If Aylo isn't on notice of what it must do to comply, then the language is vague, which is a separate legal problem.