Florida Presents Its Laughable Appeal For Its Unconstitutional Social Media Content Moderation Law

Now that Texas has signed its unconstitutional social media content moderation bill into law, the action shifts back to Florida's similar law, which was already declared unconstitutional in an easy decision by the district court. Florida has filed its opening brief in its appeal before the 11th Circuit and... it's bad. I mean, really, really bad. Embarrassingly bad. This isn't a huge surprise, since their arguments in the district court were also bad. But now that they've had a judge smack them down fairly completely, including in terribly embarrassing oral arguments, you'd think that maybe someone would think to try to lawyer better? Though, I guess, you play the hand you're dealt, and Florida gave its lawyers an unconstitutionally bad hand.
Still, I'd expect at least marginally better lawyering than the kind commonly found on Twitter or in our comments. It starts out bad and gets worse. First off, the brief claims it's been proven that social media platforms "arbitrarily discriminate against disfavored speakers" and uses a really bad example.
The record in this appeal leaves no question that social media platforms arbitrarily discriminate against disfavored speakers, including speakers in Florida. The record is replete with unrebutted examples of platforms suppressing user content for arbitrary reasons. E.g., App.891 (Doc.106-1 at 802) (Facebook censoring The Babylon Bee, a Florida-based media company, for obviously satirical content). When caught, platforms frequently cast these decisions off as "mistakes." E.g., App.1693 (Doc.106-5 at 201). But systematic examinations show that platforms apply their content standards differently to content and speakers that express different views but are otherwise similarly situated, all while publicly claiming to apply those standards fairly. See App.999, 1007, 1183 (Doc.106-2 at 14, 22; Doc.106-3 at 17). There are many examples in the Appendix, and even that list is hardly exhaustive.
Except that at scale, tons of mistakes are made, so yes, many of these are mistakes. And others may not be, but it is up to the platform to determine who breaks its rules. Much more importantly, it is entirely within the rights of private companies to moderate as they see fit and to interpret their own terms of service. So even if there were proof of "discrimination" here (and there is not), it's not against the law.
From there it just gets silly:
"Undoubtedly, social media is the modern public square." Packingham v. North Carolina, 137 S. Ct. 1730, 1737 (2017). In S.B. 7072 (the "Act")...
Generally speaking, citing Packingham in support of a plan to force private actors to host speech shows you have totally misunderstood Packingham and are either too ignorant or too disingenuous to take seriously. Packingham is about preventing the government from passing laws that cut people off from full internet access. It does not mean that any private company has to provide access to anyone.
The argument that Florida's law is not pre-empted by Section 230 is nonsense. Section 230 is clear that no state law can contradict it by imposing liability on private website operators (or users) over the actions of their users. But that's exactly what Florida's law does.
As the District Court tacitly acknowledged, the only part of that statute that could possibly preempt the Act is Section 230(c)(2). But that provision serves only to absolve platforms of liability when they remove in good faith content that is "objectionable" within the meaning of Section 230(c)(2). That leaves myriad ways in which the Act can apply consistently with Section 230(c)(2). For example, the Act and Section 230 can peacefully coexist when a social media platform fails to act in "good faith," when the Act does not regulate the removal or restriction of content, or when a platform removes unobjectionable material.
This ranges from disingenuous to downright wrong, and completely ignores the interplay between 230(c)(1) and 230(c)(2) and, notably, the fact that nearly every lawsuit regarding moderation has said that (c)(1) protects all moderation choices, whether or not they are in "good faith." And Section 230 clearly also pre-empts any state attempt to impose liability for moderation that is protected by (c)(1). Florida's lawyers just ignore this. Which is kind of stunning. It's not like the lawyers for NetChoice and CCIA are going to ignore it too. And they can point to dozens upon dozens of cases that prove Florida wrong.
The 1st Amendment argument is even worse:
Plaintiffs are also unlikely to succeed on their claim that the Act violates the First Amendment on its face. Most of the Act is directed at ensuring that social media platforms host content in a transparent fashion. For example, the Act requires non-controversial, factual disclosures, and disclosure requirements have long coexisted with the First Amendment. Even the portions of the Act that regulate the manner in which platforms host speech are consistent with the First Amendment. When properly analyzed separately from the Act's other provisions -- and from the extraneous legislative statements on which the District Court primarily relied -- these requirements parallel other hosting regulations that the Supreme Court has held are consistent with the First Amendment. E.g., Rumsfeld v. FAIR, Inc., 547 U.S. 47, 63 (2006). The Act's hosting regulations prevent the platforms from silencing others. They leave platforms free to speak for themselves, create no risk that a user's speech will be mistakenly attributed to the platforms, and intrude on no unified speech product of any platform. These requirements are little different from traditional regulation of common carriers that has long been thought consistent with the First Amendment.
The reliance on Rumsfeld v. FAIR is quite silly, and the few people who have brought it up also tend to look quite silly. This is not even remotely similar to the Rumsfeld situation, which was very narrow and very specific and cannot be extended to apply to an entire social media platform. And to just sort of toss in the idea that social media is a common carrier -- when they do not meet (at all) the classification of a common carrier, and have never been deemed a common carrier -- is just boldly stupid.
There's more, of course, but those are the basics. You never know how a court is going to decide -- and perhaps you get a confused and persuadable judge (there are, unfortunately, a few of those out there). But this is really weak and seems unlikely to hold up.