International Business Times UK
Tanmay Puri

'This is Not Therapy': Microsoft AI CEO Reacts to People Using AI for Breakups, Family Problems

Microsoft AI CEO Mustafa Suleyman warns against treating AI chatbots like therapists. (Credit: Pixabay)

Do you ever use an AI tool like ChatGPT to talk through your relationship problems? Perhaps to process a breakup, or simply to vent about family issues? If so, you are not alone: AI has woven itself into the habits of everyday life in ways most of us could not have imagined a few years ago.

Not long ago, this kind of AI was confined to academic labs and high-end tech conferences. Now chatbots such as ChatGPT and Microsoft's Copilot can be found in pockets, on desktops and, increasingly, in the quiet corners of our emotional and social lives. People are turning to these systems not just to solve maths problems or draft emails, but to talk about heartbreak, family disputes and big personal dilemmas.

The trend has sparked a wide-ranging debate about the role these tools should play in matters of the heart and mind. It is a topic that has drawn the attention of Microsoft AI CEO Mustafa Suleyman, who has candidly shared his views on the use of AI for emotional support.

How People Are Using AI for Personal Problems Instead of Therapy

This might not be news to you; you, or someone you know, may well have had a long personal conversation with ChatGPT. Across social media, it is clear that a large number of people are using AI chatbots as a stand-in for human support. Millennials and Gen Z in particular are reported to be turning to AI when they want a non-judgmental, private space to talk through breakups, anxieties or family disagreements.

Some people describe these interactions as a 'safe space' to work through their thoughts without fear of embarrassment or social consequences. Others value the convenience of an always-available companion that listens patiently and offers perspective at any hour of the day or night.

Research bears this out. A study published in JAMA Network Open found that around one in eight young people in the United States say they use AI chatbots for mental health advice, with even higher numbers among older teens and young adults. The appeal is simple: the accessibility, immediacy and perceived privacy of these tools are among the main factors driving their use.

Personal stories shared in online forums also illustrate the many ways people use AI like ChatGPT for emotional support. Some report finding immediate comfort after a distressing event, or clarity when they were unsure how to handle a conflict. While some of these stories have positive elements, experts are keen to point out that this does not equate to clinical therapy. AI lacks the training and intuition of a qualified professional, and without that expertise it cannot reliably assess risk, identify problematic psychological patterns or respond appropriately in a crisis. Even the chatbots themselves seem to acknowledge this: in many cases, ChatGPT and similar tools advise users to contact a real mental health professional after responding to such queries.

Mental health professionals also warn that using AI in place of real therapy carries risks. Unlike trained therapists, AI cannot reliably challenge distorted thinking or recognise harmful behaviour. Overreliance on AI for emotional processing may also increase feelings of isolation and create unhealthy attachments to a machine that, despite sounding empathetic, does not truly feel or understand.

Microsoft's AI CEO on the AI Therapy Trend

Mustafa Suleyman, the CEO of Microsoft AI, has weighed in on this trend with a balanced perspective. Speaking on the Breakdown podcast with Mayim Bialik, he acknowledged that people are indeed using AI chatbots to deal with emotional problems, from breakups to familial disagreements. At the same time, he was careful to say that this should not be confused with professional therapy. 'This is not therapy', he said, drawing a clear line between the computational support offered by AI and the complex, human-led practice of psychological treatment.

That said, Suleyman acknowledged that the way these models are designed, to be non-judgmental, respectful and empathetic in their responses, has inadvertently made them very useful for people seeking emotional support. In his view, AI can act as a kind of emotional sounding board, helping users 'detoxify' their thoughts and giving them a way to articulate feelings they might otherwise bottle up. He suggested this could have genuine benefits, such as making users feel seen and understood in a way they might struggle to find elsewhere.

Still, Suleyman sounded a note of caution. He pointed to the 'dependency risk' that comes with continuous use of AI for personal support. If people begin to rely on these systems as their main source of emotional guidance, they may find themselves stuck in a cycle where they turn to machines instead of developing real-world relationships or seeking professional help when it is truly needed.
