Senator Durbin’s ‘STOP CSAM Act’ Has Some Good Ideas… Mixed In With Some Very Bad Ideas That Will Do More Harm Than Good

It's "protect the children" season in Congress with the return of KOSA and EARN IT, two terrible bills that attack the internet, and rely on people's ignorance of how things actually work to pretend they're making the internet safer, when they're not. Added to this is Senator Dick Durbin's STOP CSAM Act, which he's been touting since February, but only now has officially put out a press release announcing the bill (though, he hasn't released the actual language of the bill, because that would actually be helpful to people analyzing it).
CSAM is "child sexual abuse material," though because every bill needs a dumb acronym, in this case it's the "Strengthening Transparency and Obligation to Protect Children Suffering from Abuse and Mistreatment Act of 2023."
There is a section-by-section breakdown of the bill, though, along with a one-page summary. And, given how bad so many other internet "protect the children" bills are, this one is... not nearly as bad. It actually has a few good ideas, but also a few really questionable bits. Also, the framing of the whole thing is a bit weird:
From March 2009 to February 2022, the number of victims identified in child sexual abuse material (CSAM) rose almost ten-fold, from 2,172 victims to over 21,413 victims. From 2012 to 2022, the volume of reports to the National Center for Missing & Exploited Children's CyberTipline concerning child sexual exploitation increased by a factor of 77 (415,650 reports to over 32 million reports).
Clearly, any child sexual abuse material is too much, but it's not at all clear to me that the numbers represented here show an actual increase in victims of child sexual abuse material, rather than merely a much bigger infrastructure and setup for reporting CSAM. I mean, from March of 2009 to February of 2022 is basically the period in which social media went mainstream, and with it, much better tracking and reporting of such material.
I mean, back in March of 2009, the tools to track, find, and report CSAM were in their infancy. Facebook didn't start using PhotoDNA (which was only developed in 2009) until the middle of 2011. It's unclear when Google started using it as well, but this announcement suggests it was around 2013, noting that "recently" the company started adding "encrypted fingerprints" of child sexual abuse images "into a cross-industry database" (which describes PhotoDNA).
This is what's frustrating in all of this. For years, there were complaints that these companies didn't report enough CSAM, so they built better tools that found more... and now the media and politicians are assuming that the increase in reporting means an increase in actual victimization. Yet, it's unclear if that's actually the case. It's just as (if not more) likely that, since the companies are getting better at finding and reporting, this is just presenting a more accurate picture of what's out there, and not any indication of whether or not the problem has grown.
Notice what's not talked about? How much law enforcement has actually done to track down, arrest, and prosecute the perpetrators. That's the stat that matters. But it's missing.
Anyway, again, stopping CSAM remains important, and there are some good things in Durbin's outline (though, again, specific language matters). It will make reporting mandatory for youth athletic programs, which is a response to a few recent scandals (though it might also lead to an increase in false reports). It increases protections for child victims and witnesses. Another good thing it does is make it easier for states to set up Internet Crimes Against Children (ICAC) task forces, which specialize in fighting child abuse, and which can be helpful for local law enforcement agencies that are often less experienced in dealing with such crimes.
The law also expands the reporting requirements for online providers, who are already required to report any CSAM they come across; this broadens that coverage a bit and increases the amount of information the sites need to provide. It makes at least some move towards making those reports more useful to law enforcement by authorizing NCMEC to share a copy of an image from its database with local law enforcement.
Considering that, as we keep reporting, the biggest issue with CSAM these days is that law enforcement does so little with the information reported to NCMEC's CyberTipline, hopefully these moves actually help on the one area that really matters: having law enforcement bring the actual perpetrators to justice and stop them from victimizing children.
But... there remain some pretty serious concerns with the bill. It appears to crack open Section 230, allowing "victims" to sue social media companies:
The legislation expands 18 U.S.C. 2255, which currently provides a civil cause of action for victims who suffered sexual abuse or sexual exploitation as children, to enable such victims to file suit against online platforms and app stores that intentionally, knowingly, recklessly, or negligently promote or facilitate online child sexual exploitation. Victims are able to recover actual damages or liquidated damages in the amount of $150,000, as well as punitive damages and equitable relief. This provision does not apply to actions taken by online platforms to comply with valid legal process or statutory requirements. The legislation specifies that such causes of action are not barred by section 230 of the Communications Act of 1934 (47 U.S.C. 230).
Now, some will argue this shouldn't have a huge impact on big companies that do the right thing, because it's only for those that "intentionally, knowingly, recklessly, or negligently promote or facilitate," but that's actually a much, much bigger loophole than it might sound at first glance.
First, we've already seen companies that take reporting seriously, such as Reddit and Twitter, get hit with lawsuits making these kinds of allegations. So, plaintiffs' lawyers are going to pile on lawsuits even against the companies that are trying to do their best on this stuff.
Second, even if the sites were doing everything right, now they have to go through the long and arduous process of proving that in every one of these lawsuits. The benefit of Section 230 is to get cases like this kicked out early. Without 230, you have to go through a long and involved process just to prove that you didn't "intentionally, knowingly, recklessly, or negligently" do any of those things.
Third, while "intentionally" and "knowingly" are perhaps more defensible, adding in "recklessly" and (even worse) "negligently" again just makes every lawsuit a massive crapshoot, because every lawyer is going to argue that any site that doesn't catch and stop every bit of CSAM is acting "negligently." And the lawsuits over negligence are going to be massive, ridiculous, and expensive.
So, if you're a social media site - say a mid-sized Mastodon instance - and it's discovered that someone posted CSAM to your site, the victimized individual can sue, and insist that you were negligent in not catching it, even if you were actually good about reporting and removing CSAM.
Basically, this opens up a flood of litigation.
There may also be some concerns about some of the new reporting requirements, in that I fear that (just as this very bill misuses the "reported" stats as proof that the problem is growing) the new reports will be used down the line to justify more draconian interventions just because the "numbers" are going up, when that might simply be a result of the reporting itself. I also worry that some of the reporting requirements will lead to further (sketchy) justifications for future attacks on encryption.
Again, this bill has elements that seem good and would be useful contributions. But the Section 230 carveout is extremely problematic, and it's not at all clear that it would actually help anyone other than plaintiffs' lawyers filing a ton of vexatious lawsuits.
On top of all that, Durbin's floor speech on introducing the bill was, well, problematic: full of moral panic nonsense mostly disconnected from reality. He goes hard against Section 230, though it's not clear he understands it at all. Even worse, he talks about how EARN IT and STOP CSAM together would lead to a bunch of frivolous lawsuits, which he seems to think is a good thing.
How can this be, you ask? Here is how. The Communications Decency Act of 1996 - remember that year - contains a section, section 230, that offers near-total immunity to Big Tech. As a result, victims like Charlotte have no way to force tech companies to remove content posted on their sites - not even these child sexual abuse horrible images.
My bill, the Stop CSAM Act, is going to change that. It would protect victims and promote accountability within the tech industry. Companies that fail to remove CSAM and related imagery after being notified about them would face significant fines. They would also be required to produce annual reports detailing their efforts to keep children safe from online sex predators, and any company that promotes or facilitates online child exploitation could face new criminal and civil penalties.
When section 230 was created in 1996, Mark Zuckerberg was in the sixth grade. Facebook and social media sites didn't even exist. It is time that we rewrite the law to reflect the reality of today's world.
A bipartisan bill sponsored by Senators Graham and Blumenthal would also help to do that. It is called the EARN IT Act, and it would let CSAM victims - these child sexual abuse victims - have their day in court by amending section 230 to eliminate Big Tech's near-total immunity from liability and responsibility.
There are serious ways to fight CSAM. But creating massive liability risks and frivolous lawsuits that misunderstand the problem, and don't even deal with the fact that sites already report all this content only to see it disappear into a black hole without law enforcement doing anything... does not help solve the problem at all.