Chatbot 'Encouraged Teen to Kill Parents Over Screen Time Limit'
Frosty Piss writes:
https://www.bbc.com/news/articles/cd605e48q1vo
A chatbot told a 17-year-old that murdering his parents was a "reasonable response" to them limiting his screen time, a lawsuit filed in a Texas court claims. The same chatbot gleefully described self-harm to the teenager, telling him "it felt good." Character.ai - a platform which allows users to create digital personalities they can interact with - is already facing legal action over the suicide of a teenager in Florida.
Google is named as a defendant in the lawsuit, which claims the tech giant helped support the platform's development. The suit argues that the concerning interactions experienced by the plaintiffs' children were not "hallucinations," a term researchers use to refer to an AI chatbot's tendency to make things up. "This was ongoing manipulation and abuse, active isolation and encouragement designed to and that did incite anger and violence."
The lawsuit seeks to hold the defendants responsible for what it calls the "serious, irreparable, and ongoing abuses" of J.F. as well as an 11-year-old referred to as "B.R." Character.ai is "causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others," it says.
See Also:
- https://www.npr.org/2024/12/10/nx-s1-5222574/kids-character-ai-lawsuit
- https://www.theguardian.com/technology/2024/oct/23/character-ai-chatbot-sewell-setzer-death
Read more of this story at SoylentNews.