Our Online Child Abuse Reporting System Is Overwhelmed, Because The Incentives Are Screwed Up & No One Seems To Be Able To Fix Them
The system meant to stop online child exploitation is failing - and misaligned incentives are to blame. Unfortunately, today's political solutions, like KOSA and STOP CSAM, don't even begin to grapple with any of this. Instead, they would put in place measures that could make the incentives even worse.
The Stanford Internet Observatory has spent the last few months doing a very deep dive on how the CyberTipline works (and where it struggles). It has released a big and important report detailing its findings. In writing up this post about it, I kept adding more and more, to the point that I finally decided it made sense to split it up into two separate posts to keep things manageable.
This first post covers the higher level issue: what the system is, why it works the way it does, and how the incentive structure of the system is completely messed up (even if it was done with good intentions), and how that's contributed to the problem. A follow-up post will cover the more specific challenges facing NCMEC itself, law enforcement, and the internet platforms themselves (who often take the blame for CSAM, when that seems extremely misguided).
There is a lot of misinformation out there about the best way to fight and stop the creation and spread of child sexual abuse material (CSAM). It's unfortunate because it's a very real and very serious problem. Yet the discussion about it is often so disconnected from reality as to be not just unhelpful, but potentially harmful.
In the US, the system that was set up is the CyberTipline, which is run by NCMEC, the National Center for Missing &amp; Exploited Children. It's a private non-profit; however, it has a close connection with the US government, which helped create it. At times, there has been some confusion about whether or not NCMEC is a government agent. The entire setup was designed to keep it non-governmental, to avoid any 4th Amendment issues with the information it collects, but courts haven't always seen it that way, which makes things tricky (even as the 4th Amendment concerns are important).
And while the system was designed for the US, it has become a de facto global system, since so many of the companies are US-based, and NCMEC will, when it can, send relevant details to foreign law enforcement as well (though, as the report details, that doesn't always work well).
The main role the CyberTipline has taken on is coordination. It takes in reports of CSAM (mostly, but not entirely, from internet platforms) and then, when relevant, hands off the necessary details to the (hopefully) correct law enforcement agency to handle things.
Companies that host user-generated content have certain legal requirements to report CSAM to the CyberTipline. As we discussed in a recent podcast, this role as a "mandatory reporter" is important in providing useful information to allow law enforcement to step in and actually stop abusive behavior. Because of the "government agent" issue, it would be unconstitutional to require social media platforms to proactively search for and identify CSAM (though many do use tools to do this). However, if they do find some, they must report it.
Unfortunately, the mandatory reporting has also allowed the media and politicians to use the number of reports sent in by social media companies in a misleading manner, suggesting that the mere fact that these companies find and report CSAM to NCMEC means that they're not doing enough to stop it on their platforms.
This is problematic because it creates a dangerous incentive: it suggests that internet services should actually not report the CSAM they find, since politicians and the media will falsely portray a large number of reports as a sign that a platform isn't taking this seriously. The reality is that the failure to take things seriously comes from the small number of platforms (Hi Telegram!) that don't report CSAM at all.
Some of us on the outside have thought that the real issue was that NCMEC and law enforcement, on the receiving end, had been unable to do enough productive with those reports. It seemed convenient for the media and politicians to simply blame social media companies for doing what they're supposed to do (reporting CSAM), ignoring that what happens on the back end of the system might be the real problem. That's why things like Senator Ron Wyden's Invest in Child Safety Act seemed like a better approach than things like KOSA or the STOP CSAM Act.
That's because the approach of KOSA, STOP CSAM, and some other bills is basically to add liability for social media companies. (These companies already do a ton to prevent CSAM from appearing on their platforms and alert law enforcement via the CyberTipline when they do find stuff.) But that's useless if those receiving the reports aren't able to do much with them.
What becomes clear from this report is that while there are absolutely failures on the law enforcement side, some of that is effectively baked into the incentive structure of the system.
In short, the report shows that the CyberTipline is very helpful in engaging law enforcement to stop some child sexual abuse, but it's not as helpful as it might otherwise be:
Estimates of how many CyberTipline reports lead to arrests in the U.S. range from 5% to 7.6%
This number may sound low, but I've been told it's not as bad as it seems. First of all, when a large number of the reports are for content that is overseas and not in the US, it's more difficult for law enforcement here to do much about it (though, again, the report details some suggestions on how to improve this). Second, some of the content may be very old, where the victim was identified years (or even decades) ago, and where there's less that law enforcement can do today. Third, there is a question of prioritization, with it being a higher priority to target those directly abusing children. But, still, as the report notes, almost everyone thinks that the arrest number could go higher if there were more resources in place:
Empirically, it is unknown what percent of reports, if fully investigated, would lead to the discovery of a person conducting hands-on abuse of a child. On the one hand, as an employee of a U.S. federal department said, "Not all tips need to lead to prosecution [...] it's like a 911 system." On the other hand, there is a sense from our respondents (who hold a wide array of beliefs about law enforcement) that this number should be higher. There is a perception that more than 5% of reports, if fully investigated, would lead to the discovery of hands-on abuse.
The report definitely suggests that if NCMEC had more resources dedicated to the CyberTipline, it could be more effective:
NCMEC has faced challenges in rapidly implementing technological improvements that would aid law enforcement in triage. NCMEC faces resource constraints that impact salaries, leading to difficulties in retaining personnel who are often poached by industry trust and safety teams.
There appear to be opportunities to enrich CyberTipline reports with external data that could help law enforcement more accurately triage tips, but NCMEC lacks sufficient technical staff to implement these infrastructure improvements in a timely manner. Data privacy concerns also affect the speed of this work.
But before we get into the specific areas where things can be improved (that's the follow-up post), I thought it was important to highlight how the incentives of this system contribute to the problem, in ways that don't necessarily have an easy solution.
While companies (Meta, mainly, since it accounts for by far the largest number of reports to the CyberTipline) keep getting blamed for failing to stop CSAM because of their large number of reports, most companies have very strong incentives to report anything they find. The cost of not reporting something they should have reported is massive (criminal penalties), whereas the cost of over-reporting is nothing to the companies. That means there's a built-in push toward over-reporting.
Of course, there is a real cost here. CyberTipline employees get overwhelmed, and that can mean that reports that should get prioritized and passed on to law enforcement don't. So you can argue that while the cost of over-reporting is "nothing" to the companies, the cost to victims and society at large can be quite large.
That's an important mismatch.
But the broken incentives go further. When NCMEC hands off reports to law enforcement, they often go through a local ICAC (Internet Crimes Against Children) task force, which helps triage them and find the right state or local law enforcement agency to handle each report. Law enforcement agencies that are "affiliated" with an ICAC receive special training on how to handle reports from the CyberTipline. But, apparently, at least some of them feel that it's just too much work, or (in some cases) too burdensome to investigate. So some law enforcement agencies are choosing not to affiliate with their local ICACs to avoid the added work, and, even worse, some that were affiliated have "unaffiliated" themselves because they just don't want to deal with it.
In some cases, there are even reports of law enforcement unaffiliating with an ICAC out of a fear of facing liability for not investigating an abused child quickly enough.
A former Task Force officer described the barriers to training more local Task Force affiliates. In some cases local law enforcement perceive that becoming a Task Force affiliate is expensive, but in fact the training is free. In other cases local law enforcement are hesitant to become a Task Force affiliate because they will be sent CyberTipline reports to investigate, and they may already feel like they have enough on their plate. Still other Task Force affiliates may choose to unaffiliate, perceiving that the CyberTipline reports they were previously investigating will still get investigated at the Task Force, which further burdens the Task Force. Unaffiliating may also reduce fear of liability for failing to promptly investigate a report that would have led to the discovery of a child actively being abused, but the alternative is that the report may never be investigated at all.
[...]
This liability fear stems from a case where six months lapsed between the regional Task Force receiving NCMEC's report and the city's police department arresting a suspect (the abused children's foster parent). In the interim, neither of the law enforcement agencies notified child protective services about the abuse as required by state law. The resulting lawsuit against the two police departments and the state was settled for $10.5 million. Rather than face expensive liability for failing to prioritize CyberTipline reports ahead of all other open cases, even homicide or missing children, the agency might instead opt to unaffiliate from the ICAC Task Force.
This is... infuriating. Cops choosing not to affiliate (i.e., not to get the training needed to help) or removing themselves from an ICAC task force because they're afraid of being sued if they fail to save kids from abuse quickly enough is ridiculous. It's yet another example of cops running away from the job they're supposed to be doing, while claiming they have no obligation to do it.
That's just one problem of many in the report; we'll get into the others in the second post. But, on the whole, it seems pretty clear that with incentives this far out of whack, bills like KOSA and STOP CSAM aren't going to be of much help. What's needed is to actually tackle the underlying issues: the funding, the technology, and (most of all) the incentive structures.