
Take It Down Act Has Best Of Intentions, Worst Of Mechanisms

by Mike Masnick, from Techdirt on (#6T1WB)

You may have heard that the US government has a bit of a mess on its hands after House Speaker Mike Johnson worked out a somewhat carefully crafted compromise continuing resolution funding plan to keep the government open, only to have it collapse after Elon Musk screamed about how bad it was and how anyone who voted for it should be voted out of office.

Lots of very, very stupid people came up with lots of very, very stupid reasons for why the continuing resolution should have been rejected, and I imagine most of them don't realize what it is they're actually asking for, or how much harm it does when the government is shut down. Elon Musk isn't going to suffer if the government is shut down, but lots of others will.

That said, I actually appreciate the underlying message that this is a stupid way to run the government, where Congress has to keep playing chicken with the debt ceiling for a budget that has already been approved, so that blowhards and know-nothings can fight over random shit just to keep the basics of the government functioning properly.

Amidst the recent political wrangling over the continuing resolution to keep the government funded, a controversial bill called the TAKE IT DOWN Act was quietly inserted into the continuing resolution at the last minute (earlier in the week I had been told it wouldn't be included). Sponsored by Senators Ted Cruz and Amy Klobuchar, the bill aims to make it easier for victims of non-consensual intimate imagery (including such imagery generated by online AI tools) to get that content removed from online platforms. While well-intentioned, the bill as currently written raises significant concerns about potential abuse and infringement on free speech.

To be clear, the bill is trying to do something good: enabling people to get non-consensual intimate imagery taken down more easily, with a specific focus on recognizable computer-generated imagery, rather than just actual photographs. But there are significant problems with the methodology here. Even if we agree that the sharing of such imagery is a real problem and should be, at the very least, socially unacceptable, any time you are creating a system to enable content to be taken down under legal threat, you also have to recognize that such a system will inevitably be abused.

And the authors and supporters of TAKE IT DOWN seem to have completely ignored that possibility. It creates a system where someone who claims to be a victim of such sharing can send a notice that effectively requires a website to remove the targeted information.

Upon receiving a valid removal request from an identifiable individual (or an authorized person acting on behalf of such individual) using the process described in paragraph (1)(A)(ii), a covered platform shall, as soon as possible, but not later than 48 hours after receiving such request-

(A) remove the intimate visual depiction; and

(B) make reasonable efforts to identify and remove any known identical copies of such depiction.

The law applies to "any online or mobile service that provides a forum for user-generated content, including messages, videos, images, games, and audio files." This means the law would impact not just big social media companies, but also small forums, hobby sites, and any other online community where users can share content.

Those forums would then be required to remove any content if they receive a "valid removal request" within 48 hours, while also making "reasonable efforts to identify and remove any known identical copies of such depiction." What exactly constitutes "reasonable efforts" is left vague, but it's not hard to imagine this meaning platforms would have to implement costly and error-prone automated content matching systems. For small sites run by volunteers, that's simply not feasible.

But nothing in the law contemplates false notices. And that's a huge problem. The only current law in the US that has a similar notice-and-takedown scheme is the DMCA, and, as we've been describing for years, the DMCA's notice-and-takedown provision is widely and repeatedly abused by people who want to take down perfectly legitimate content.

There have been organized attempts to flood systems with tens of thousands of bogus DMCA notices. A huge 2016 study found that the system is so frequently abused to remove non-infringing works as to call into question the validity of the entire notice-and-takedown procedure. And that's the DMCA, which in theory has a clause that is supposed to punish fraudulent takedown notices (even if that's rarely effective).

Here, the law doesn't even contemplate such a system. Instead, it just assumes all notices will be valid.

On top of that, requiring covered platforms to "identify and remove any known identical copies" suggests that basically every website will have to purchase potentially expensive proactive scanning software that can match images, whether through hashes or otherwise. Yes, Meta and Google can do that kind of thing (and already do!). But the person who runs a local book club forum or a citywide gardening forum isn't going to be able to do that kind of thing.
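To make the technical burden concrete, here is a minimal sketch of what even the simplest version of that matching looks like, assuming a hypothetical blocklist of exact SHA-256 digests (the function names and structure here are illustrative, not anything the bill or any platform actually specifies; real systems generally rely on perceptual hashing tools like PhotoDNA or PDQ, which are far harder for a small site to build or license):

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of SHA-256 digests for images already reported
# via takedown notices. In practice this would have to be persisted and
# updated every time a new removal request arrives.
reported_hashes: set[str] = set()

def fingerprint(image_path: Path) -> str:
    """Return the SHA-256 digest of an image file's raw bytes."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def register_takedown(image_path: Path) -> None:
    """Record the hash of a reported image so future uploads can be checked."""
    reported_hashes.add(fingerprint(image_path))

def is_known_copy(upload_path: Path) -> bool:
    """Check whether an uploaded file is byte-identical to a reported image.

    Note the limitation: any re-encode, crop, or single-pixel change produces
    a different digest, which is why large platforms use perceptual hashing
    instead of exact matching.
    """
    return fingerprint(upload_path) in reported_hashes
```

Even this toy version only catches byte-identical copies; catching "identical" images in any practical sense means running far more sophisticated (and expensive) matching infrastructure on every upload.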

The folks at the Center for Democracy and Technology (CDT) recently wrote up an analysis of the law that calls out these problems:

The TAKE IT DOWN Act requires covered platforms, as soon as possible but not later than 48 hours after receiving a valid request, to remove reported NDII and to make reasonable efforts to identify and remove any known identical copies of such depictions. Doing so at scale, and in that timeframe, would require the widespread use of automated content detection techniques such as hash matching. Hashes are "digital fingerprints" that can be used by platforms to detect known images across their services once the image has been distributed and assists in removal of the identified content if it violates the platform's use policy or the law. Many platforms already use hash matching for known NDII, child sexual abuse material (CSAM), and terrorist and violent extremist content, though none of these processes is currently required by U.S. law. While TAKE IT DOWN does not expressly mandate the use of hash matching, since services already commonly use the technology to identify known-violating content, it would likely be understood to be a "reasonable effort to identify and remove" known NDII under the bill.

As currently drafted, however, the TAKE IT DOWN Act raises complex questions implicating the First Amendment that must be addressed before final passage. As a general matter, a government mandate for a platform to take down constitutionally protected speech after receiving notice would be subject to close First Amendment scrutiny. The question is whether a narrowly drawn mandate focused on NDII with appropriate protections could pass muster. Although some NDII falls within a category of speech outside of First Amendment protection such as obscenity or defamation, at least some NDII that would be subject to the Act's takedown provisions, even though unquestionably harmful, is likely protected by the First Amendment. For example, unlike the proposed Act's criminal provisions, the takedown provision would apply to NDII even when it was a matter of public concern. Moreover, the takedown obligation would apply to all reported content upon receipt of notice, before any court has adjudicated whether the reported image constitutes NDII or violates federal law, let alone whether and how the First Amendment may apply. Legally requiring such take-down without a court order implicates the First Amendment.

As CDT notes, at least adding some "guardrails" against abuse of the takedown process could help deal with the First Amendment problems of the bill:

To increase the chance of surviving constitutional scrutiny, the takedown provisions in the TAKE IT DOWN Act should be more narrowly tailored and include more guardrails. The Act currently does not include many of the DMCA's guardrails intended to prevent abusive or malicious takedown requests. Even with those guardrails, complainants routinely abuse the DMCA takedown process, leading to the censorship of constitutionally-protected information and criticism. Under current processes, for example, complainants have successfully used the DMCA to take down negative video game reviews, silence parody, and shut down civil society YouTube accounts. The TAKE IT DOWN Act risks repeating this abuse by not expressly exempting commercial pornographic content from the takedown mechanism, only excluding matters of public concern from its criminal prohibitions (but not the takedown mechanism), and not including other protections, such as requiring complainants to attest under penalty of perjury that they are authorized to file a notice on a person's behalf and other appropriate safeguards. While an NDII takedown mechanism should minimize burden on victims, such steps will mitigate the risks of abuse and the removal of content that cannot or should not be restricted from publication under the takedown mechanism.

The rise of AI-powered "nudify" apps and similar tools has understandably increased the urgency to address the creation and spread of non-consensual intimate imagery. But as concerning as that problem is, rushed and overly broad legislation like the TAKE IT DOWN Act risks causing its own harms. By failing to include robust safeguards against abuse, this bill would create a sprawling extrajudicial takedown system ripe for exploitation and suppression of legitimate speech.

Cramming such a consequential and constitutionally dubious measure into a must-pass spending bill is a disturbing way to legislate. If Congress truly wants to tackle this issue, it needs to slow down, consider the risks, and craft a narrower solution that doesn't sacrifice crucial free speech protections in the name of expediency. Rushing to regulate away the problem, no matter how well-intentioned, will likely only create new problems, while simultaneously setting the extremely problematic expectation that Congress can pass laws requiring the removal of whatever content it disapproves of.

That's a dangerous road to start down, no matter how noble the initial cause may be.
