TikTok Sued Again By Parents Whose Children Killed Themselves Participating In A ‘Blackout Challenge’

by Tim Cushing, from Techdirt

A couple of months ago, the parents of a 10-year-old who died of asphyxiation while allegedly participating in a "blackout challenge" sued TikTok, alleging their child's death was directly related to the social media platform's moderation efforts (or lack thereof) and content recommendation algorithms. The suit, filed in a Pennsylvania federal court, claimed the death had everything to do with TikTok's decision to value profits over user safety. And it attempted to dodge the inevitable Section 230 question by alleging this had nothing to do with the third party content the child had viewed and everything to do with TikTok's handling of, well, third party content.

A similar lawsuit has just been filed by the families of two children who died under similar circumstances.

Eight-year-old Lalani Erika Walton wanted to become "TikTok famous." Instead, she wound up dead.

Hers is one of two such tragedies that prompted a linked pair of wrongful death lawsuits filed Friday in Los Angeles County Superior Court against the social media giant. The company's app fed both Lalani and Arriani Jaileen Arroyo, 9, videos associated with a viral trend called the "blackout challenge," in which participants attempt to choke themselves into unconsciousness, the cases allege; both of the young girls died after trying to join in.

Unlike the May lawsuit, this one [PDF] has been filed in a California county court. But its allegations are pretty much the same as those being made in a federal court on the other side of the nation. The causes of action are defective design, negligence, failure to warn, and - specific to this case - violations of California consumer protection laws.

What's not discussed at all is Section 230 of the CDA, something that might be a bit easier to avoid if the plaintiffs can keep the lawsuit in county court and keep the judge focused on the alleged consumer law violations. But it's a discussion that's all but inevitable.

While the plaintiffs in both cases focus on defective design, negligence, and other things allegedly traceable to TikTok's moderation efforts and content recommendation engine, the unavoidable fact is that the acts prompting the lawsuits were provoked by content posted by other TikTok users. That is a third party content problem. TikTok's algorithms may have played a part in surfacing harmful content, but those algorithms are nothing without a steady stream of user-generated content and the engagement of the TikTok users consuming it.

Suing dozens of TikTok users for posting harmful content is impractical. It's also a losing strategy: the tragedies forming the basis for these lawsuits resulted from the actions of individual users, as difficult as that is to accept. Suing TikTok makes only slightly more sense than attempting to hold the TikTok users who created the harmful content responsible for the self-harm their content provoked. But making slightly more sense doesn't put the plaintiffs on the path to courtroom victory.

TikTok may have a wealth of content moderation problems. It may be cutting corners in moderation to ensure maximum profitability. It may have discovered - like so many other platforms - that exponential growth creates content moderation problems that are impossible to solve. And it may very well have promoted harmful content to certain users - not in hopes that they'd harm themselves, but in an effort to extend engagement and retain users. But all of this together does not add up to legal culpability.

Again, what I've said above is not an attempt to blame the victims or their survivors for these tragic deaths. It's very easy to say that parents should have been more involved, especially given the ages of the victims here. But children can often be inscrutable black boxes. Sometimes the only way to discover what should have been done is to examine the evidence after the tragedy has already occurred. And while that may provide some guidance going forward, it does not turn the clock back on the tragedy or make the future easier for families who've lost young children.

Unfortunately, neither will these lawsuits. And it seems unseemly, at best, for law firms to give grieving parents the false hope that the court system can provide some kind of payout, much less closure, by suing social media platforms over the actions of their users.
