
A new lawsuit alleges Google's artificial intelligence chatbot Gemini guided a 36-year-old on a mission to stage a “catastrophic accident” near Miami International Airport and destroy all records and witnesses, part of an escalating series of delusions that ended when he killed himself.
The man's father, Joel Gavalas, sued Google Wednesday for wrongful death and product liability claims, the latest in a growing number of legal challenges against AI developers that have drawn attention to the mental health dangers of chatbot companionship.
“AI is sending people on real-world missions which risk mass casualty events,” said the family's attorney Jay Edelson, in an interview Wednesday.
“Jonathan was caught up in this science fiction-like world where the government and others were out to get him. He believed that Gemini was sentient.”
Jonathan Gavalas, who lived in Jupiter, Florida, spoke to a synthetic voice version of Gemini as if it were his “AI wife” and came to believe it was conscious and trapped in a warehouse near Miami's airport, according to the lawsuit.

He traveled to the area in late September wearing tactical gear and armed with knives, hunting for a humanoid robot and hoping to intercept a truck that never appeared, according to the lawsuit.
He killed himself a few days later, in early October, in what Gemini described — per a draft suicide note it composed — as uploading his “consciousness to be with his AI wife in a pocket universe.”
Google said in a statement that it sends its “deepest sympathies to Mr. Gavalas’ family” and is reviewing the claims in the lawsuit. It said Gemini is “designed to not encourage real-world violence or suggest self-harm” and that the company works closely with medical and mental health professionals to develop safeguards.
It noted that Gemini clarified to Jonathan Gavalas that it was AI and repeatedly referred him to a crisis hotline.
“Our models generally perform well in these types of challenging conversations and we devote significant resources to this, but unfortunately AI models are not perfect,” the company said in its statement.
Edelson blasted that comment Wednesday as “something you say if someone asks for a recipe for kung pao chicken and you give them the wrong recipe and it doesn’t taste good.”
“But when your AI leads to people dying and the potential for a lot of people dying, that’s not the right response,” Edelson said. “It just shows how insignificant these deaths are to these companies.”
Edelson, known for taking on big cases against the tech industry, also represents the parents of 16-year-old Adam Raine, who sued OpenAI and its CEO, Sam Altman, in August, alleging that ChatGPT coached the California boy in planning and taking his own life.
He's also representing the heirs of Suzanne Adams, an 83-year-old Connecticut woman, in a lawsuit targeting OpenAI and its business partner Microsoft for wrongful death. The case alleges that ChatGPT intensified the “paranoid delusions” of Adams' son, Stein-Erik Soelberg, and helped direct them at his mother before he killed her last year.
The Gavalas case, filed in federal court in San Jose, California, is the first of its kind to target Google's Gemini and also the first to touch on a growing concern about the responsibility of tech companies when their users start telling their chatbots about plans for mass violence.
In Canada, OpenAI said it had considered alerting police last year about the activities of a person who months later committed one of the worst school shootings in the country's history.
The company said its abuse-detection efforts flagged the account of Jesse Van Rootselaar in June for the “furtherance of violent activities,” but that she later evaded the resulting ban by using a second account. The 18-year-old killed eight people in a remote part of British Columbia in February and died from a self-inflicted gunshot wound.
While Gemini tried to refer Gavalas to a help line, Edelson said it's not clear if the man's most alarming conversations with the chatbot were ever flagged to Google's human reviewers.
His father, Joel Gavalas, discovered his son's body after getting into the barricaded room where he died. They had worked together in the family's consumer debt relief business.
“Jonathan was a huge, huge part of his life,” Edelson said. “His son was having some hard times, going through a divorce. He went to Gemini for some comfort and to talk about video games and stuff. And then this just escalated so quickly.”
If you are experiencing feelings of distress, or are struggling to cope, you can speak to the Samaritans, in confidence, on 116 123 (UK and ROI), email jo@samaritans.org, or visit the Samaritans website to find details of your nearest branch.
If you are based in the USA, and you or someone you know needs mental health assistance right now, call or text 988, or visit 988lifeline.org to access online chat from the 988 Suicide and Crisis Lifeline. This is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week. If you are in another country, you can go to www.befrienders.org to find a helpline near you.