Judge Rejects Yet Another Attempt By Texas To Police Online Speech

by Mike Masnick from Techdirt

Everything's bigger in Texas, including the legislature's willingness to pass laws that clearly violate the First Amendment rights of websites. This is now the third Texas law aimed at website moderation practices in the last three years to be thrown out by a district court as an unconstitutional violation of the First Amendment. You'd think maybe the state's leaders, who claim to be big First Amendment supporters, would recalibrate.

Remember, Texas was one of the earliest states to pass a law that sought to block social media from moderating content, claiming that allowing websites to have such editorial control harmed the free speech rights of citizens. Amusingly, this new law has the same state of Texas ordering platforms to moderate other content, insisting that mandated takedowns are no violation of the First Amendment at all.

So on the one hand, the Texas state legislature thinks it can tell websites what content they can't take down. And on the other, it thinks it can tell them what content they have to take down. It was wrong both times.

Texas HB 18 is one of a large and growing list of laws seeking to "protect the children" online with unconstitutional restrictions. These laws are showing up in red states and blue states and everything in between. This one has a bunch of provisions requiring age verification on any social media site, blocking of certain types of content for minors, and some level of parental controls.

The list of these laws and the challenges to them is growing so long that it's easy to miss some of them. So I had seen that CCIA and NetChoice had challenged Texas HB 18 and had put it on my list of things to write up eventually. However, by the time I got around to it, we'd already had the first major decision in the case, enjoining significant parts of the law as unconstitutional under the First Amendment.

It's a mostly good ruling by Judge Robert Pitman, who had also made an amazingly good ruling three years ago throwing out Texas' other social media content moderation law. The Fifth Circuit then made a total mess of things, leading the Supreme Court to just recently send the case back, noting how much of a mess the Fifth Circuit had made.

In this case, though, the ruling is a bit more of a mixed bag. It's mostly good in that it calls out the most obviously unconstitutional bits and blocks Texas from enforcing them. But there's more that's maybe a little problematic, as I'll explain at the end.

Also, this case is up against the backdrop of a third bad Texas law, the one requiring age verification for adult content websites. Last year, when that law was challenged, a different judge (Judge David Alan Ezra, who is technically based in Hawaii, but was hearing Texas cases because Texas doesn't have enough judges) pointed out how obviously unconstitutional age verification is. Once again, the Fifth Circuit then made a mess of things, saying that it could ignore multiple precedents and that age verification was fine. The Supreme Court recently agreed to hear that case, meaning that at least some part of this law (which has an age verification component) is going to need to wait until the Supreme Court sorts out the previous case.

That said, in a post-Moody world, the Supreme Court has said that courts hearing facial challenges to internet regulations must walk through every possible element of the law to determine whether the whole thing needs to be thrown out. Thus, Judge Pitman walks through every last bit.

Texas AG Ken Paxton sought to block the case on a bunch of technicalities, but his efforts failed. It's not worth going through the details here other than to note that Paxton challenged "associational standing." This is something that Justice Clarence Thomas has been whining about lately, saying that trade associations (like CCIA and NetChoice) shouldn't have standing to bring these challenges. However, as we've explained in great detail, that would be a disaster. Companies are easier to pressure into not challenging laws, whereas trade groups have a lot more independence.

Also, we have a very long history of trade groups being told they do have standing. This would be a major change, and thankfully Pitman doesn't take the bait.

Then we get to the main show: the First Amendment. Pitman notes that the law clearly impacts speech, and thus must pass strict scrutiny to survive. Paxton tried to claim, based on the mess the Fifth Circuit made in the original social media law case, that strict scrutiny would not apply, or at least not to the entire law. And Pitman responds with a "hey, did you not notice that the Supreme Court wiped out that ruling?"

In response, Paxton suggests that these arguments are foreclosed by the Fifth Circuit. (Resp., Dkt. 18, at 24 (citing NetChoice, LLC v. Paxton, 49 F.4th 439, 480 (5th Cir. 2022) ("NetChoice I"), vacated and remanded sub nom. Moody v. NetChoice, LLC, 144 S. Ct. 2383 (2024))). In NetChoice I, the Fifth Circuit rejected a similar argument brought by Plaintiffs by holding that regulations targeting social media did not render [the law at issue] content-based because "the excluded websites are fundamentally dissimilar mediums." NetChoice I, 49 F.4th at 480.

That ruling is no longer binding because the Supreme Court vacated NetChoice I, "void[ing] each of the judgment's holdings." Doe v. McKesson, 71 F.4th 278, 286 (5th Cir. 2023); see also Moody, 144 S. Ct. at 2409 (vacating judgment). Paxton suggests that Moody "effectively confirmed, or at least did not disturb, the Fifth Circuit's analysis on this point." But the Supreme Court "disturbed" the analysis when it vacated the opinion. Paxton suggests that the Supreme Court's own opinion also "rebuffed" Plaintiffs' theory, but it is not clear how. (See Resp., Dkt. 18, at 24). To the contrary, the Supreme Court expressly stated that "there has been enough litigation already to know that the Fifth Circuit, if it had stayed the course, would get wrong at least one significant input . . . ." Moody, 144 S. Ct. at 2349. While the Supreme Court "did not determine whether to apply strict or intermediate scrutiny[,]" that was only because Texas's law "[did] not pass" either intermediate or strict scrutiny, at least applied to key respects of the law.

Pitman points out that courts in Ohio and Mississippi had found problems with similar laws, especially when they treat different kinds of content differently. This law, like the ones in those other states, tried to narrowly mandate controls for social media while explicitly trying to carve out "news" sites, which showed that it was discriminatory.

Like the district courts in Yost and Fitch, this Court finds that HB 18 discriminates based on the type of content provided on a medium, not just the type of medium. A DSP that allows users to socially interact with other users but primarily "functions to provide" access to news or commerce is unregulated. An identical DSP, with the exact same medium of communication and method of social interaction, but primarily "functions to provide" updates on what a user's friends and family are doing (e.g., through Instagram posts and stories), is regulated. If there is a difference between the regulated DSP and unregulated DSP, it is the content of the speech on the site, not the medium through which that speech is presented. When a site chooses not to primarily offer news but instead focus on social engagement, it changes from an uncovered to covered platform. But the type of medium has not changed, only the content primarily expressed on the platform.

In sum, strict scrutiny applies to HB 18's provisions because the law regulates DSPs based on the content of their speech and the identity of the speaker.

Because of this, Paxton will need to satisfy strict scrutiny, which means showing that the law is "the least restrictive means of achieving a compelling state interest." Because of the Moody ruling, the court agrees to go provision-by-provision on this question. And thus, the "monitoring and filtering" provisions of the law fail as unconstitutional. There's some good language in here, even though the Fifth Circuit will probably wipe it out in a few months.

These requirements force providers to develop strategies to "prevent [a] known minor's exposure to harmful material and other content that promotes, glorifies, or facilitates: (1) suicide, self-harm, or eating disorders; (2) substance abuse; (3) stalking, bullying, or harassment; or (4) grooming, trafficking, child pornography, or other sexual exploitation or abuse." HB 18 § 509.053. Irrespective of whether HB 18 as a whole is content-based, there can be little dispute that this provision is. The monitoring-and-filtering requirements explicitly identify discrete categories of speech and single them out to be filtered and blocked. That is as content based as it gets.

It is far from clear that Texas has a compelling interest in preventing minors' access to every single category of information listed above. Some interests are obvious: no reasonable person could dispute that the state has a compelling interest in preventing minors from accessing information that facilitates child pornography or sexual abuse. See Sable Commc'ns of California, Inc. v. FCC, 492 U.S. 115, 126 (1989) ("[T]here is a compelling interest in protecting the physical and psychological well-being of minors."). On the other end, many interests are not compelling, such as regulating content that might advocate for the deregulation of drugs (potentially "promoting" "substance abuse") or defending the morality of physician-assisted suicide (likely "promoting" "suicide"). See Brown v. Ent. Merchants Ass'n, 564 U.S. 786, 794-95 (2011) ("No doubt a State possesses legitimate power to protect children from harm, but that does not include a free-floating power to restrict the ideas to which children may be exposed.") (internal citation omitted). The Supreme Court has repeatedly emphasized that "[s]peech that is neither obscene as to youths nor subject to some other legitimate proscription cannot be suppressed solely to protect the young from ideas or images that a legislative body thinks unsuitable for them." Erznoznik v. Jacksonville, 422 U.S. 205, 213-14 (1975). Many of the regulated topics are simply too vague to even tell if the interest is compelling. Terms like "promoting," "glorifying," "substance abuse," "harassment," and "grooming" are undefined, despite their potential wide breadth and politically charged nature. While these regulations may have some compelling applications, the categories are so exceedingly overbroad that such a showing is unlikely.

The judge notes that even if you could make a case that the state has a compelling interest in stopping some of these categories of content, the law is not narrowly tailored enough to meet strict scrutiny.

As in Fitch, Paxton has not shown that the alternative suggested by [Plaintiffs], "a regime of providing parents additional information or mechanisms needed to engage in active supervision over children's internet access would be insufficient to secure the State's objective of protecting children." 2024 WL 3276409, at *12. By contrast, Plaintiffs have demonstrated that many DSPs do implement content-moderation policies to ensure that minors cannot access harmful content. (Mot. Prelim. Inj., Dkt. 6, at 22). And Paxton has not shown that methods such as "hash-sharing technology" and publishing depictions of filtered content are necessary to prevent harm to minors. In short, HB 18 does not employ the "least restrictive means" to stop minors from accessing harmful material. See United States v. Playboy Ent. Grp., 529 U.S. 803, 813 (2000).

HB 18 also employs overbroad terminology. Again, the monitoring-and-filtering requirements impose sweeping ex-ante speech restrictions, akin to prior restraints, but do little more than vaguely gesture at what speech must be restrained. For example, what does it mean for content to "promote" "grooming"? The law is not clear. So, by requiring filtering as a matter of law with only vague reference to what must be filtered, HB 18 will likely filter out far more material than needed to achieve Texas's goal.
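As an aside, the "hash-sharing technology" the court mentions above generally refers to platforms comparing fingerprints of uploaded files against shared industry databases of known abusive material. Here's a minimal, purely illustrative Python sketch of the idea; the blocklist contents are hypothetical, and real systems (PhotoDNA, for example) use perceptual hashes so that resized or re-encoded copies still match, rather than the exact-match cryptographic hash used here for simplicity:

import hashlib

# Hypothetical shared blocklist of SHA-256 digests of known-bad files.
# (Real hash-sharing databases are industry-maintained and perceptual;
# this exact-match version is just for illustration.)
SHARED_HASH_BLOCKLIST = {
    # SHA-256 digest of the bytes b"test", standing in for a known-bad file.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Compute a hex digest that identifies this exact file."""
    return hashlib.sha256(data).hexdigest()

def is_known_bad(data: bytes) -> bool:
    """Check an upload against the shared blocklist before publishing."""
    return fingerprint(data) in SHARED_HASH_BLOCKLIST

print(is_known_bad(b"test"))          # True: matches the blocklist
print(is_known_bad(b"other upload"))  # False: unknown file, allowed

The court's point stands regardless of the mechanism: Paxton never showed that mandating tools like this is the least restrictive way to protect minors.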

And then there's the problem that all these laws have. They only cover some sites that host the content Texas finds so problematic.

More problematically, the law is underinclusive. A law that is "wildly underinclusive when judged against its asserted justification . . . is alone enough to defeat it." Brown, 564 U.S. at 802. Websites that "primarily" produce their own content are exempted, even if they host the same explicitly harmful content, such as "promoting" "eating disorders" or "facilitating" "self-harm." The most serious problem with HB 18's under-inclusivity is that it threatens to censor social discussions of controversial topics. "[S]ocial media in particular" operates as one of "the most important places . . . for the exchange of views . . . ." Packingham, 582 U.S. at 104. But HB 18 specifically cuts teenagers off from this critical "democratic forum[] of the Internet" even though the same harmful content is available elsewhere. Reno v. ACLU, 521 U.S. 844, 868 (1997). A teenager can read Peter Singer advocate for physician-assisted suicide in Practical Ethics on Google Books but cannot watch his lectures on YouTube or potentially even review the same book on Goodreads. In its attempt to block children from accessing harmful content, Texas also prohibits minors from participating in the democratic exchange of views online. Even accepting that Texas only wishes to prohibit the most harmful pieces of content, a state cannot pick and choose which categories of protected speech it wishes to block teenagers from discussing online. Brown, 564 U.S. at 794-95.

Pitman also notes that some of the language of the law is so vague as to make it unconstitutional as well:

Begin with the verbs: promote, glorify, and facilitate. One of those words, "promote," has already been held to be vague when regulating First Amendment activity. In Baggett v. Bullitt, 377 U.S. 360, 371-72 (1964), the Supreme Court dealt with a regulation that imposed a loyalty oath for teachers to swear that they will "promote respect for the flag and the institutions of the United States." (emphasis added). The Supreme Court found that the term "promote" was "very wide indeed" and failed to "provide[] an ascertainable standard of conduct." Id. In response, Paxton suggests that Baggett "dealt with a 'wildly different situation' than this one." (Resp., Dkt. 18, at 38). But, if anything, the vagueness is more problematic under HB 18, because the law requires social media DSPs to guess which broad categories of speech, likely constituting billions of posts, must be filtered from view. So the wide-ranging meanings of "promote" will result in wide-ranging censorship of speech.

The problem is even more acute with the term "glorifying." The word encompasses so wide an ambit that "people of common intelligence" can do no more than guess at its application. McClelland, 63 F.4th at 1013. To "glorify" potentially includes any content that favorably depicts a prohibited topic, leaving no clear answer on what content must be filtered. Do liquor and beer advertisements "glorify" "substance abuse"? Does Othello "glorify" "suicide"? Given the substantial liability companies face for failing to comply (to say nothing of the private rights of action), it is reasonable to expect that companies will adopt broad definitions that do encompass such plainly protected speech.
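To make that over-blocking dynamic concrete, here's a toy Python sketch of the kind of blunt filter a liability-averse platform might deploy. The keyword lists are entirely hypothetical (nothing in HB 18 or the ruling specifies them), which is exactly the problem the court is describing: with undefined terms and real liability, the safe move is to filter broadly.

# Hypothetical keyword lists a cautious platform might adopt when the
# statute never defines "glorify" or "substance abuse."
SUICIDE_TERMS = {"suicide", "self-harm"}
SUBSTANCE_TERMS = {"beer", "liquor", "whiskey"}

def must_filter(post: str) -> bool:
    """Flag any post that mentions a regulated topic, however innocently."""
    words = {word.strip(".,!?\"'").lower() for word in post.split()}
    return bool(words & (SUICIDE_TERMS | SUBSTANCE_TERMS))

posts = [
    "Othello's suicide is the tragic climax of Shakespeare's play.",
    "Try our new craft beer this weekend!",
]
for post in posts:
    # Both of these plainly protected posts get blocked for minors.
    print(must_filter(post), "-", post)

The point isn't that platforms would write anything this crude; it's that vague statutory terms push compliance in exactly this over-inclusive direction.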

Other parts of the law have definition problems too:

The final issue for HB 18 is that the law fails to define key categories of prohibited topics, including "grooming," "harassment," and "substance abuse." At what point, for example, does alcohol use become "substance abuse"? When does an extreme diet cross the line into an "eating disorder"? What defines "grooming" and "harassment"? Under these indefinite meanings, it is easy to see how an attorney general could arbitrarily discriminate in his enforcement of the law. See Smith v. Goguen, 415 U.S. 566, 575 (1974) ("Statutory language of such a standardless sweep allows [] prosecutors[] and juries to pursue their personal predilections."). These fears are not too distant: pro-LGBTQ content might be especially targeted for "grooming." See Little v. Llano Cnty., No. 1:22-CV-424-RP, 2023 WL 2731089, at *2 (W.D. Tex. Mar. 30, 2023) (finding that several books supporting pro-LGBTQ views were removed from library shelves for allegedly promoting "grooming"), aff'd as modified, 103 F.4th 1140 (5th Cir. 2024), reh'g en banc granted, opinion vacated, 106 F.4th 426 (5th Cir. 2024). Content related to marijuana use might be prosecuted as "glorifying" "substance abuse," even if cigarette and alcohol use is not. This vast indefinite scope of enforcement would effectively "grant[] [the State] the discretion to [assign liability] selectively on the basis of the content of the speech." City of Houston, Tex. v. Hill, 482 U.S. 451, 465 n.15 (1987). Such a sweeping grant of censorial power cannot pass First Amendment scrutiny.

The court also finds that Section 230 preempts Texas' law. This is an issue we've brought up with many state laws, and one the courts have mostly ignored for a few years. Section 230 is clear that it preempts any state law that attaches liability to that which Section 230 immunizes. Pitman points out that this is clearly the case with this law.

Paxton argued that Section 230 preemption shouldn't apply because the law wouldn't hold platforms liable for third-party content, but rather just for violating the law itself. Judge Pitman points out that this is not how anything works:

Imagine that Texas passed a law stating, "Social media websites must remove defamatory content." Under Paxton's broad reading of Free Speech Coalition, the law would not be preempted because liability attaches based on whether a website complies with the law, not based on its content. That reasoning would altogether nullify Section 230 by having the same effect as directly imposing liability on the website for hosting third-party content. Section 230 provides "broad immunity" for providers for all claims "stemming from their publication of information created by third parties." MySpace, 528 F.3d at 418 (emphasis added). Liability under HB 18 stems from the content it hosts, even if liability directly attaches based on compliance with the law. Accordingly, the Court finds that Section 230 preempts HB 18's monitoring and filtering requirements.

That said, there is still one part of the ruling that is problematic. The judge allows the "data privacy, parental control, and disclosure provisions" to go forward, saying that CCIA & NetChoice failed to show how those provisions violate the First Amendment.

It remains possible that each provision will fail under strict scrutiny. But that is not a given. And it is not certain to be the case under HB 18, where many provisions seem to regulate conduct and only incidentally burden speech (if at all). See Moody, 144 S. Ct. at 2402 n.4. Plaintiffs do not show how Section 509.052 places any burden on speech by prohibiting the collection of PII and geolocation data. This is primarily a regulation of conduct, so it is not clear that the law restricts or even burdens speech. Similarly, it is not clear that a law requiring parents to be allowed to access and change their children's privacy settings implicates First Amendment concerns. Overall, these provisions likely primarily regulate conduct, and while the Court can conceive of ways in which they do burden speech (e.g., reducing the hours a child may spend consuming speech on social media), that point is not sufficiently developed at this stage.

Hopefully, this will change with more briefing, as all three of those provisions raise serious First Amendment issues. Parental controls obviously impact the First Amendment rights of children. The disclosure provisions raise compelled speech concerns for platforms, some of which were discussed in the recent Ninth Circuit ruling in NetChoice v. Bonta.

But still, on the whole, this is a good ruling. Now we just need to wait for the Fifth Circuit to mess it all up.
