Snapchat Isn't Liable For Connecting 12-Year-Old To Convicted Sex Offenders
An anonymous reader quotes a report from Ars Technica: A judge has dismissed (PDF) a complaint from a parent and guardian of a girl, now 15, who was sexually assaulted when she was 12 years old after Snapchat recommended that she connect with convicted sex offenders. According to the court filing, the abuse that the girl, C.O., experienced on Snapchat began soon after she signed up for the app in 2019. Through its "Quick Add" feature, Snapchat "directed her" to connect with "a registered sex offender using the profile name JASONMORGAN5660." After a little more than a week on the app, C.O. was bombarded with inappropriate images and subjected to sextortion and threats before the adult user pressured her to meet up, then raped her. Cops arrested the adult user the next day, leading to his incarceration, but his Snapchat account remained active for three years despite reports of harassment, the complaint alleged. Two years later, at 14, C.O. connected with another convicted sex offender on Snapchat, a former police officer who offered to give C.O. a ride to school and then sexually assaulted her. The second offender is also currently incarcerated, the judge's opinion noted.

The lawsuit painted a picture of Snapchat's ongoing neglect of minors it knows are being targeted by sexual predators. Before the attacks on C.O., both adult users had sent and requested sexually explicit photos, seemingly without the app detecting any child sexual abuse materials exchanged on the platform. C.O. had previously reported other adult accounts for sending her photos of male genitals, but Snapchat allegedly "did nothing to block these individuals from sending her inappropriate photographs." Among other complaints, C.O.'s lawsuit alleged that the algorithm behind Snapchat's "Quick Add" feature was the problem: it allegedly detects when adult accounts are seeking to connect with young girls and, by design, recklessly sends more young girls their way -- continually steering sexual predators toward vulnerable targets. Snapchat is allegedly aware of these abuses and, the lawsuit argued, should therefore be held liable for the harm caused to C.O.

Although C.O.'s case raised difficult questions, Judge Barbara Bellis ultimately agreed with Snapchat that Section 230 of the Communications Decency Act barred all claims and shielded Snap because "the allegations of this case fall squarely within the ambit of the immunity afforded to" platforms publishing third-party content. According to Bellis, C.O.'s family had "clearly alleged" that Snap failed to design its recommendation systems to block young girls from receiving messages from sexual predators. But because Bellis considered the messages exchanged to be third-party content, Section 230 shields Snap from liability in this case; designing recommendation systems to deliver that content, she ruled, is a protected activity. Despite a seemingly conflicting ruling in Los Angeles, which found that "Section 230 didn't protect Snapchat from liability for allegedly connecting teens with drug dealers," Bellis did not appear to consider that decision persuasive. She did, however, critique Section 230's broad application, suggesting that without legislative changes, courts remain limited even in morally challenging cases.
Read more of this story at Slashdot.