The REPORT Act: Enhancing Online Child Safety Without the Usual Congressional Nonsense
For years and years, Congress has been pushing a parade of horrible "protect the children online" bills that somehow seem to get progressively worse each time. I'm not going to go through the entire list of them, because it's virtually endless.
One of the most frustrating things about those bills, and the pomp and circumstance around them, is that they ignore the simpler, more direct things Congress could do that would actually help.
Just last week, we wrote about the Stanford Internet Observatory's big report on the challenges facing the CyberTipline, run by the National Center for Missing & Exploited Children (NCMEC). We wrote two separate posts about the report (and also discussed it on the latest episode of our new podcast, Ctrl-Alt-Speech) because there was so much useful information in there. As we noted, there are real challenges in making the reporting of child sexual abuse material (CSAM) work better, and it's not because people don't want to help. It's actually because of a set of complex issues that are not easily solvable (read the report or my articles for more details).
But there were still a few clear steps that could be taken by Congress to help.
This week, the REPORT Act passed Congress, and it includes... a bunch of those straightforward, common-sense things that should help improve the CyberTipline process. The key provision allows the CyberTipline to modernize a bit, including by using cloud storage. To date, no cloud storage vendors could work with NCMEC, out of fear that they'd face criminal liability for hosting CSAM.
This bill fixes that, and should enable NCMEC to make use of some better tools and systems, including better classifiers, which are becoming increasingly important.
There are also some provisions letting victims, and parents of victims, report CSAM involving the child directly to NCMEC, which can be immensely helpful in trying to stop the spread of some content (and in focusing some law enforcement responses).
There are also some technical fixes that require platforms to retain certain records for a longer period of time. This was another important point that was highlighted in the Stanford report. Given the flow of information and prioritization, sometimes by the time law enforcement realized it should get a warrant to get more info from a platform, the platform would have already deleted it as required under existing law. Now that time period is extended to give law enforcement a bit more time.
The one bit that we'll have to see how it works is that it extends the reporting requirements for social media to include violations of 18 USC 1591, the law against sex trafficking. Senator Marsha Blackburn, a co-author of the bill, is claiming that this means that big tech companies will now be required to report when children are being "trafficked, groomed or enticed by predators."
So, it's possible I'm misreading the law (and how it works with existing laws...) but I see nothing limiting this to "big tech." It appears to apply to any "electronic communication service provider or remote computing service."
Also, given that Marsha Blackburn appears to consider "grooming" to include things like LGBTQ content in schools, I worried that this was going to be a backdoor to making all internet websites have to "report" such content to NCMEC, which would flood its systems with utter nonsense. Thankfully, 1591 includes some pretty specific definitions of sex trafficking that do not match up with Blackburn's definition. So she'll get the PR victory among the nonsense peddlers for pretending that the bill will lead to the reporting of the non-grooming that she insists is grooming.
And, of course, while this bill was actually good (and it's surprising to see Blackburn on a good internet bill!), it's not going to stop her from continuing to push KOSA and other nonsense moral panic "protect the children" bills that will actually do real harm.