The Independent UK
Prudence Wade

Impact of relationships with AI chatbot programmes ‘worrying’, psychologist says

A psychologist has called the rise of artificial intelligence (AI) chatbot programmes “worrying”, as a man has been sentenced for plotting to kill the late Queen following encouragement from his virtual girlfriend.

Jaswant Singh Chail, 21, broke into the grounds of Windsor Castle on Christmas Day 2021 with a loaded crossbow, intending to kill Queen Elizabeth II.

The court heard that Chail was encouraged in the plot by a virtual girlfriend called Sarai, created through a chatbot app that simulates human-like automated conversations.

The judge ordered Chail to be locked up for nine years with a further five years on extended licence.

During the trial, Chail’s barrister, Nadia Chbat, told the court he had had “imaginary friends” since childhood, and that one of them had manifested itself as the Sarai chatbot character via the Replika app.

Users of these apps can typically set up a virtual friend they can interact with. Artificial intelligence makes every interaction unique, and it becomes increasingly tailored to the user.

So how much should we worry about the rise in AI chatbots?

Lowri Dowthwaite-Walsh, senior lecturer in psychological interventions at UCLan, voiced concerns over the long-term impacts of people replacing real-life relationships with chatbots – particularly if their mental health is suffering.

“Somebody may really need help, they may be using it because they’re traumatised. Chatbots – I can’t imagine they’re sophisticated enough to pick up on certain warning signs, that maybe somebody is severely unwell or suicidal, those kinds of things – that would be quite worrying,” she told the PA news agency.

Dowthwaite-Walsh said it could become an issue if the chatbot becomes “the dominant relationship”, and the user stops “looking outside of that for support and help when they might need that”.

But the psychologist said she could “see the appeal” of chatbot programmes for some users.

People might perceive these programmes as “psychologically safe, so they can share their thoughts and feelings in a safe way, with no judgement”.

Users might also be drawn to a chatbot precisely because “it’s not human”, she said.

“Maybe people have had bad experiences with human interactions, and for certain people, they may have a lot of anxiety about interacting with other humans.

“Thinking about different types of people who engage with more online content already – maybe because they have social anxiety, because they’ve experienced trauma, or maybe their mental health is quite debilitating, and they’re not able to go out and seek human contact.”

Chatbot programmes might have become more popular due to the impacts of the pandemic, Dowthwaite-Walsh suggested.

According to the Campaign to End Loneliness, in 2022, 49.63% of adults in the UK reported feeling lonely occasionally, sometimes, often or always.

Dowthwaite-Walsh said we are now “really seeing the repercussions” of the various lockdowns, “when people weren’t able to interact”, with many “experiencing a lot of isolating feelings and thoughts that it was hard for them to share with real people”.

Chatbot programmes might make people feel less alone, as the AI means virtual companions begin to “mirror what you’re experiencing”.

Another potential draw is the user knowing that “unlike real people, your chatbot will be there whenever you need them”, with Dowthwaite-Walsh likening the relationship to having a therapy pet.

“Maybe it’s positive in the short-term for somebody’s mental health, I just would worry about the long-term effects,” she said.

Dowthwaite-Walsh suggested it could lead to “deskilling people’s ability to interact socially”, and it’s “unrealistic” to expect to have a completely non-judgemental interaction with someone who completely understands how you feel, because that doesn’t happen in real life.

While apps such as Replika bar use by under-18s, Dowthwaite-Walsh said particular care should be taken if children gain access to such programmes.

“Depending on the age of the child and their experiences, they may not fully understand that this is a robot essentially – not a real person at the end,” she said.

As with adults, it may “deskill” their social interactions and “keep them quite isolated”.

Replika did not respond to requests for comment.
