
Confused NY Court Says That Section 230 Doesn’t Block Ridiculous Lawsuit Blaming Social Media For Buffalo Shooter

by Mike Masnick from Techdirt on (#6KJ0G)

Can you imagine what kind of world we'd live in if you could blame random media companies for tangential relationships they had with anyone who ever did anything bad? What would happen if we could blame newspapers for inspiring crime? Or television shows for inspiring terrorism? The world would be a much duller place.

We've talked a lot about how the entire purpose of Section 230 of the Communications Decency Act is to put the liability on the right party. That is, it's entirely about making sure the right party is being sued, and about avoiding the waste of everyone's time that comes from suing some random party, especially in "Steve Dallas"-type lawsuits, where you sue some random company, tangentially connected to some sort of legal violation, because it has the deepest pockets.


Unfortunately, a judge in the NY Supreme Court (which, bizarrely, is NY's lowest level of courts) has allowed just such a lawsuit to move forward. It was filed by the son of a victim of the racist dipshit who went into a Buffalo supermarket and shot and killed a bunch of people a couple of years ago. That is, obviously, a very tragic situation. And I can certainly understand the search for someone to blame. But blaming "social media" because someone shot up a supermarket is ridiculous.

It's exactly the kind of thing that Section 230 was designed to get tossed out of court quickly.

Of course, NY officials spent months passing the blame and pointing at social media companies. NY's Governor and Attorney General wanted to deflect blame from the state's massive failings in handling the situation. I mean, the shooter had made previous threats, and law enforcement in NY had been alerted to those threats and failed to stop him. Also, he used a high capacity magazine that was illegal in NY and law enforcement failed to stop him. Also, when people in the store called 911, the dispatcher didn't believe them and hung up on them.

The government had lots of failings that aren't being investigated, and lawsuits aren't being filed over those. But, because the shooter also happened to be a racist piece of shit on social media, people want to believe we should be able to magically sue social media.

And, the court is allowing this based on a very incorrect understanding of Section 230. Specifically, the court has bought into the trendy, but ridiculous, "product liability" theory that is now allowing frivolous and vexatious plaintiffs across the country to use this "one weird trick" to get around Section 230. Just claim the product was "defective" and, boom, the court will let the case go forward.

That's what happened here:

The social media/internet defendants may still prove their platforms were mere message boards and/or do not contain sophisticated algorithms thereby providing them with the protections of the CDA and/or First Amendment. In addition, they may yet establish their platforms are not products or that the negligent design features plaintiff has alleged are not part of their platforms. However, at this stage of the litigation the Court must base its ruling on the allegations of the complaint and not "facts" asserted by the defendants in their briefs or during oral argument and those allegations allege viable causes of action under a products liability theory.

Now, some might say "no big deal, the court says they can raise this issue again later," but nearly all of the benefit of Section 230 is in how it gets these frivolous cases tossed early. Otherwise, the expense of these cases adds up and creates a real mess (along with the pressure to settle).

Also, the judge here seems very confused. Section 230 does not protect "mere message boards." It protects any interactive computer service from being held liable for third-party speech. And whether or not they contain "sophisticated algorithms" should have zero bearing on whether or not Section 230 applies.

The test for whether Section 230 applies is quite simple: (1) Is the defendant an interactive computer service? (2) Would holding it liable in this scenario mean holding it liable for the speech of someone else? (3) Does the claim avoid the limited exceptions to Section 230 (i.e., intellectual property law, sex trafficking, and federal criminal law)? That's it. Whether or not you're a message board, or whether you use algorithms, has nothing to do with it.
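To make the structure of that test concrete, here is a minimal sketch in Python (purely illustrative, with hypothetical names; this is obviously not the statute's text or any court's actual analysis) showing how the three prongs combine: immunity applies only when the first two answers are yes and no exception applies.

```python
# Illustrative only: a simplified sketch of the three-prong Section 230
# analysis described above. Names and structure are hypothetical.

from dataclasses import dataclass


@dataclass
class Claim:
    defendant_is_interactive_computer_service: bool  # prong 1
    liability_targets_third_party_speech: bool       # prong 2
    falls_under_exception: bool                       # prong 3: IP, trafficking, or federal criminal law


def section_230_bars_claim(claim: Claim) -> bool:
    """Immunity applies only when all three prongs are satisfied.

    Nothing here asks whether the defendant is a 'mere message board'
    or whether it uses 'sophisticated algorithms' -- those factors are
    not part of the test.
    """
    return (
        claim.defendant_is_interactive_computer_service
        and claim.liability_targets_third_party_speech
        and not claim.falls_under_exception
    )


# Example: a suit seeking to hold an ordinary social platform liable for a user's posts
example = Claim(
    defendant_is_interactive_computer_service=True,
    liability_targets_third_party_speech=True,
    falls_under_exception=False,
)
print(section_230_bars_claim(example))  # True -> the claim should be dismissed early
```

The point of the sketch is simply that "message board" status and "algorithms" never appear anywhere in the analysis.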

Again, this kind of ruling only encourages more such vexatious litigating.

Outside of Section 230, the social media defendants sought to go to the heart of the matter and just made it clear that there's no causal link between "dipshit being a dipshit on social media" and "dipshit going on a murderous rampage."

And, again, the judge doesn't seem to much care, saying that a jury can figure that out:

As a general proposition the issue of proximate cause between the defendants' alleged negligence and a plaintiff's injuries is a question of fact for a jury to determine. Oishei v. Gebura 2023 NY Slip Op 05868, 221 AD3d 1529 (4th Dept 2023). Part of the argument is that the criminal acts of the third party break any causal connection, and therefore causation can be decided as a matter of law. There are limited situations in which the New York Court of Appeals has found intervening third party acts to break the causal link between parties. These instances are "where only one conclusion may be drawn from the established facts and where the question of legal cause may be decided as a matter of law." Derdiarian v Felix Contr. Corp., 51 NY2d 308 at 315 (1980). These exceptions involve independent intervening acts that do not flow from the original alleged negligence.

Again, though, getting this kind of case to a jury would be crazy. It would be a massive waste of everyone's resources. By any objective standard, anyone looking at this case would recognize that it is not just frivolous and vexatious, but that it creates really terrible incentives all around.

If these kinds of cases are allowed to continue, you will get more such frivolous lawsuits over anything bad that happens. Worse, you will get much less speech online overall, as websites will have an incentive to take down or block any speech that isn't Sesame Street-level in tone. Any expression of anger, any complaint, any expression of unhappiness about anything could otherwise be seen as proof that the social media site was "defective in its design" for not magically connecting that expression to future violence.

That would basically be the end of any sort of online forum for mental health. It would be the end of review sites. It would be the end of all sorts of useful websites, because the liability that could accrue from just one user in those forums saying something negative would be too much. If just one of the people in those forums then took action in the real world, people could blame the site and sue it for not magically stopping the real-world violence.

This would be a disaster for the world of online speech.

As Eric Goldman notes, it seems unlikely that this ruling will survive on appeal, but it's still deeply problematic:

I am skeptical this opinion will survive an appeal. The court disregards multiple legal principles to reach an obviously results-driven decision from a judge based in the emotionally distraught community.

The court doesn't cite other cases involving similar facts, including Gibson v. Craigslist and Godwin v. Facebook. One of the ways judges can reach the results they want is by selectively ignoring the precedent, but that approach doesn't comply with the rule of law.

This opinion reinforces how the "negligent design" workaround to Section 230 will functionally eliminate Section 230 if courts allow plaintiffs to sue over third-party content by just relabeling their claims.

Separately, I will note my profound disappointment in seeing a variety of folks cheering on this obviously problematic ruling. Chief among them: Free Press. I don't always agree with Free Press on policy prescriptions, but generally, their heart is in the right place on core internet freedom issues. Yet, they put out a press release cheering on this ruling.

In the press release, they claim (ridiculously) that letting this case move forward is a form of "accountability" for those killed in the horrific shooting in Buffalo. But anyone should recognize that it is no such thing. There are people to hold liable for what happened there: most obviously the shooter himself. But trying to hold random social media sites liable for not somehow stopping future real-world violence is beyond silly. As described above, it's also extremely harmful to the free speech causes that Free Press claims as part of its mission.

I'm surprised and disappointed to see them take such a silly stance, one that undermines their credibility. But I'm even more disappointed in the court for ruling this way.
