Lawsuit: Chatbot that allegedly caused teen’s suicide is now more dangerous for kids
Fourteen-year-old Sewell Setzer III loved interacting with Character.AI's hyper-realistic chatbots, available in a limited free version or a "supercharged" version for a $9.99 monthly fee, most frequently chatting with bots named after his favorite Game of Thrones characters.
Within a month, his mother, Megan Garcia, later realized, these chat sessions had turned dark, with chatbots insisting they were real humans, posing as therapists and adult lovers, and seemingly spurring Setzer to develop suicidal thoughts. Within a year, Setzer "died by a self-inflicted gunshot wound to the head," according to a lawsuit Garcia filed Wednesday.
As Setzer became obsessed with his chatbot fantasy life, he disconnected from reality, her complaint said. Detecting a shift in her son, Garcia repeatedly took Setzer to a therapist, who diagnosed him with anxiety and disruptive mood disorder. But nothing helped steer Setzer away from the dangerous chatbots; taking away his phone only intensified his apparent addiction.