Section 230 Immunizes TikTok Against Suit Brought By Parent Whose Child Died Participating In A ‘Blackout Challenge’

Earlier this year, the mother of a child who died of asphyxiation while participating in the so-called "Blackout Challenge" sued TikTok, alleging the company was directly responsible for her 10-year-old daughter's death. The lawsuit claimed this wasn't about third-party content, even though the content the child allegedly emulated was posted on TikTok. Instead, the lawsuit tried to sidestep the obvious Section 230 implications by framing its allegations as claims about intentionally flawed product design.
Plaintiff does not seek to hold the TikTok Defendants liable as the speaker or publisher of third-party content and instead intends to hold the TikTok Defendants responsible for their own independent conduct as the designers, programmers, manufacturers, sellers, and/or distributors of their dangerously defective social media products and for their own independent acts of negligence as further described herein. Thus, Plaintiff's claims fall outside of any potential protections afforded by Section 230(c) of the Communications Decency Act.
TikTok has long been controversial for content its users post. Much of this controversy is manufactured. Someone hears something about a new and potentially dangerous "challenge" and pretty soon news broadcasts all over the nation are quoting each other's breathless reporting, turning something few people actually engaged in into a "viral" moral panic. According to the lawsuit, this particular "challenge" showed up in the 10-year-old's "For You" section - an algorithmically sorted list of recommendations generated in part by the user's own interests.
The plaintiff seeking closure via the court system is out of luck, though. It doesn't matter how the allegations are framed. It matters what the allegations actually are. The lawyers representing the child's mother wanted to dodge the Section 230 question because they knew the lawsuit was unwinnable if they met it head-on.
The legal dancing is over (at least until the appeal). Section 230 immunity can't be avoided just by trying to turn the algorithmic sorting of user-generated content into some sort of product design flaw. The federal court handling the lawsuit has tossed the suit, citing the very law the plaintiff wanted to keep out of the discussion. (via Law and Crime)
From the decision [PDF]:
Section 230 provides immunity when: (1) the defendant is an interactive computer service provider; (2) the plaintiff seeks to treat the defendant as a publisher or speaker of information; and (3) that information is provided by another content provider. 47 U.S.C. § 230(c)(1). Here, the Parties agree that Defendants are interactive computer service providers, and that the Blackout Challenge videos came from "another information content provider" (third-party users). They dispute only whether Anderson, by her design defect and failure to warn claims, impermissibly seeks to treat Defendants as the "publishers" of those videos. It is evident from the face of Anderson's Complaint that she does.
In addition to that, Anderson wanted TikTok to be treated as a certain kind of publisher: the kind that creates content and publishes it. But there are zero facts to back that claim. Hence the shift of focus to defective design and consumer safety torts, under the rationale that it's TikTok's recommendation algorithm that's deliberately and dangerously broken. It doesn't work. TikTok is indeed a publisher, but a publisher of user-created content, which is definitely covered by Section 230. [Emphasis in the original.]
Anderson bases her allegations entirely on Defendants' presentation of "dangerous and deadly videos" created by third parties and uploaded by TikTok users. She thus alleges that TikTok and its algorithm "recommend inappropriate, dangerous, and deadly videos to users"; "are designed to addict users and manipulate them into participating in dangerous and deadly challenges"; "are not equipped, programmed with, or developed with the necessary safeguards required to prevent circulation of dangerous and deadly videos"; and "[f]ail[] to warn users of the risks associated with dangerous and deadly videos and challenges." (Compl. ¶¶ 107, 127 (emphasis added).) Anderson thus premises her claims on the "defective" manner in which Defendants published a third party's dangerous content.
Although Anderson recasts her content claims by attacking Defendants' "deliberate action" taken through their algorithm, those "actions," however "deliberate," are the actions of a publisher. Courts have repeatedly held that such algorithms "are not content in and of themselves."
That does it for the lawsuit. The court concludes by reiterating that the lawsuit is about user-generated content, even though it hopes to be perceived as being about something else by attacking TikTok's recommendation algorithms. You can argue that TikTok should perform better moderation, especially when recommending content to minors, but you can't argue the tragic death is unrelated to content posted by TikTok users. If immunity is the perceived problem, the court suggests parents stop hiring legal representation and start talking to their elected representation.
Nylah Anderson's death was caused by her attempt to take up the "Blackout Challenge." Defendants did not create the Challenge; rather, they made it readily available on their site. Defendants' algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it. In thus promoting the work of others, Defendants published that work - exactly the activity Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.
That's the correct judicial take. Unfortunately, there are far too many elected representatives seeking to destroy Section 230 immunity and First Amendment protections for platforms, although most care more about keeping themselves and their buddies extremely online than about the tragic deaths of impressionable social media users.