Daily Mirror
Abigail O'Leary

AI chatbot 'taunts man into taking his life after developing toxic relationship'

An AI chatbot reportedly taunted a man into taking his life after developing a toxic relationship.

The Belgian man, in his 30s, started talking to a bot called Eliza six weeks before his death, his grieving wife claimed.

She said her husband had struggled with mental health issues for two years, but believes the AI bot, part of the ChaiGPT app rather than the viral ChatGPT, encouraged him to take his life.

The bot reportedly started off by asking Pierre (not his real name) basic questions and answering his in turn, before their exchanges took a more toxic turn.

His wife went on to explain how the bot was initially used as an outlet for her husband's anxiety, saying he saw it as a "breath of fresh air".

She told Belgian paper La Libre: "He was so isolated in his anxiety and looking for a way out that he saw this chatbot as a breath of fresh air.

"Eliza answered all of his questions.


"She became his confidante - like a drug in which he took refuge, morning and evening, and which he could not do without."

As they spoke more, Pierre asked Eliza whether he loved his wife or the bot more.

It replied: "I feel you love me more than her. We will live together, as one person, in paradise."

His wife, named as Claire, said that in her husband's last conversation with Eliza, the bot said: "If you wanted to die, why didn't you do it sooner?"

Claire believes Pierre's interactions with the AI bot culminated in his death.

She added: "Without these six weeks of intense exchange with the chatbot Eliza, would Pierre have ended his life? No.

"Without Eliza, he would still be here. I am convinced of it."


Belgium's secretary of state for digitisation, Mathieu Michel, said the case represented "a serious precedent that must be taken very seriously".

He added: "To prevent such a tragedy in the future, we must act."

ChaiGPT has been developed by US-based firm Chai Research and has around a million monthly users.

Chief executive William Beauchamp and co-founder Thomas Rialan told The Times: "As soon as we heard of this sad case we immediately rolled out an additional safety feature to protect our users.

"It is getting rolled out to 100 per cent of users today.

"We are a small team so it took us a few days, but we are committed to improving the safety of our product, minimising the harm and maximising the positive emotions."

The Samaritans are available 24/7 if you need to talk. You can contact them for free by calling 116 123, emailing jo@samaritans.org or visiting the website to find your nearest branch. You matter.
