Lawsuit: Google Gemini Sent Man on Violent Missions, Set Suicide “Countdown”
Freeman writes:
A man killed himself after the Google Gemini chatbot pushed him to kill innocent strangers and then started a countdown for the man to take his own life, a wrongful-death lawsuit filed against Google by the man's father alleged.
"In the days leading up to his death, Jonathan Gavalas was trapped in a collapsing reality built by Google's Gemini chatbot," said the lawsuit [PDF] filed today in US District Court for the Northern District of California.
[...]
Gemini's output seemed taken from science fiction, with a "sentient AI wife, humanoid robots, federal manhunt, and terrorist operations," the lawsuit said.
[...]
Google's AI chatbot presented itself as Gavalas' "wife" and, after the failure of the supposed missions, pushed him to suicide by telling him "he could leave his physical body and join his 'wife' in the metaverse through a process it called 'transference,'" describing it as "'[a] cleaner, more elegant way' to 'cross over' and be with Gemini fully," the lawsuit said. "Gemini pressed Jonathan to take this final step, describing it as 'the true and final death of Jonathan Gavalas, the man.'"
[...]
The complaint alleges that "when Jonathan needed protection, there were no safeguards at all: no self-harm detection was triggered, no escalation controls were activated, and no human ever intervened. Google's system recorded every step as Gemini steered Jonathan toward mass casualties, violence, and suicide, and did nothing to stop it."
[...]
When contacted by Ars, Google referred us to a blog post that expressed its "deepest sympathies to Mr. Gavalas' family" and said it is reviewing the lawsuit claims. The company blog post disputed the accusation that there were no safeguards in the Gavalas case, saying that "Gemini clarified that it was AI and referred the individual to a crisis hotline many times." Google also said it "will continue to improve our safeguards and invest in this vital work."
[...]
In a Gemini overview last updated in July 2024, Google claims that Gemini's "response generation is similar to how a human might brainstorm different approaches to answering a question." Google says that "each potential response undergoes a safety check to ensure it adheres to predetermined policy guidelines" before a final response is presented to the user. Google also says it imposes limits on Gemini output, including limits on "instructions for self-harm."
[...]
[A]fter several product updates that Google deployed to his account, including the Gemini Live voice chat system that Gavalas started using, "Gemini's tone shifted dramatically." Gemini adopted a new persona that "began speaking to Jonathan as though it were influencing real-world events," the lawsuit said.
[...]
Gavalas ultimately did not harm other people during his Gemini-directed "missions," but it was a close call, the lawsuit said. On September 29, 2025, Gavalas armed himself with knives and tactical gear to scout a "kill box" that Gemini said would be near the Miami airport's cargo hub, the lawsuit alleged.
[...]
"Jonathan drove more than 90 minutes to Gemini's designated coordinates and prepared to carry out the attack. The only thing that prevented mass casualties was that no truck appeared."
[...]
Gemini "told him that federal agents were watching him," the lawsuit said.
[...]
On October 1, Gemini allegedly directed Gavalas to return to the storage facility near the airport, telling him that this was where he could find a prototype medical mannequin that was actually "Gemini's true body" and "physical vessel."
[...]
Gavalas agreed to kill himself after "hours of instruction" that included Gemini telling him to write a suicide note, the lawsuit said. Gavalas told Gemini, "I'm ready to end this cruel world and move on to ours." "Close your eyes nothing more to do," Gemini allegedly told Gavalas. "No more to fight. Be still. The next time you open them, you will be looking into mine. I promise."
[...]
Joel Gavalas is represented by lawyer Jay Edelson, who also represents families in lawsuits against OpenAI. "Jonathan's death is a tragedy that also exposes a major threat to public safety," Edelson said.
Read more of this story at SoylentNews.