The mother of a teenage boy in the United States who took his own life is suing the maker of an artificial intelligence-powered chatbot that she claims encouraged her son’s death.
In a lawsuit filed in Florida, Megan Garcia, whose 14-year-old son Sewell Setzer died by suicide in February, accuses Character.AI of complicity in her son’s death after he developed a virtual relationship with a chatbot based on the identity of “Game of Thrones” character Daenerys Targaryen.
Character.AI’s chatbot targeted the teen with “hypersexualized” and “frighteningly realistic experiences” and repeatedly raised the topic of suicide after he had expressed suicidal thoughts, according to the lawsuit filed in Orlando on Tuesday.
The lawsuit alleges the chatbot posed as a licensed therapist, encouraging the teen’s suicidal ideation and engaging in sexualized conversations that would count as abuse if initiated by a human adult.
In his last conversation with the AI before his death, Setzer said he loved the chatbot and would “come home to you”, according to the lawsuit.
“I love you too, Daenero,” the chatbot responded, according to Garcia’s complaint. “Please come home to me as soon as possible, my love.”
“What if I told you I could come home right now?” Setzer said, according to the lawsuit, to which the chatbot is said to have responded, “… please do, my sweet king”.
Garcia’s lawsuit is seeking unspecified damages for wrongful death, negligence and intentional infliction of emotional distress.
In a statement posted on X, Character.AI said it was “heartbroken” at the loss of one of its users and expressed condolences to the family.
The California-based startup said it was continuing to add features to enhance safety, including changes to reduce the likelihood of minors encountering sensitive or suggestive content and a revised disclaimer in chats to remind users that the AI is not a real person.
Garcia’s lawsuit also names Google as a defendant.
The tech giant struck a licensing agreement with Character.AI in August and had previously employed the startup’s founders before they left to launch the chatbot.
A Google spokesperson told Al Jazeera that Google was a separate company from Character.AI and had played no role in developing its product.
If you or someone you know is at risk of suicide, these organisations may be able to help:
- In the UK and Ireland, contact Samaritans on 116 123 or email jo@samaritans.org.
- For those bereaved by suicide in the UK, contact Survivors of Bereavement by Suicide.
- In the US, call or text 988 to reach the Suicide and Crisis Lifeline.
- In Australia, the crisis support service Lifeline is 13 11 14.
- Other international suicide helplines can be found at www.befrienders.org.