People are already using chatbots as therapists, and the emergence of generative AI is raising new questions about tech's role in mental health.
Why it matters: Virtually no one is suggesting you replace a compassionate human professional with a probability-driven neural network — but plenty of users seeking info or help say they appreciate the approachability (and low cost) of an onscreen text box.
What’s happening: Users are filling online forums with accounts of their experiences casting ChatGPT as their personal therapist.
- In the ChatGPT subreddit, it’s easy to find people offering examples of addressing trauma or attempting to improve communication skills with the tech.
- Others are sharing advice on what kind of prompts to use and how to get the best responses in a ChatGPT therapy session.
- The low cost isn't the only lure — users also praise the accessibility of the tech and the comfort they feel in engaging with it.
What they’re saying: “As someone who has consumed a lot of mental health services in his life, I can say that I found [ChatGPT] to be incredibly helpful, much more than many of the humans I have interacted with,” one Reddit user shared.
- “Typically these coaching sessions, these therapy sessions can cost upwards of $90 in the U.S., and with ChatGPT you can have access to it for free,” YouTuber Arnold Trinh said in a video. “Of course it’s not to replace a real therapist, but it does a really good job of emulating the experience.”
The other side: ChatGPT creator OpenAI's usage policies say its tech should not be used to tell “someone that they have or do not have a certain health condition” or to provide “instructions on how to cure or treat a health condition.”
- “OpenAI’s models are not fine-tuned to provide medical information,” the policies say. “You should never use our models to provide diagnostic or treatment services for serious medical conditions. OpenAI’s platforms should not be used to triage or manage life-threatening issues that need immediate attention.”
The big picture: Platforms that offer mental health services through text, like BetterHelp, have flourished in the pandemic era, and a growing number of them now offer chatbots of their own.
- Recent apps like Wysa, Limbic and Replika all offer users AI-driven conversations about mental health. Some cast themselves as a complement to talk therapy, while others, like Replika, offer a "companion" who is "always ready to chat when you need an empathetic friend."
- These apps' popularity has also raised alarms over their effectiveness and their ability to protect users' privacy.
Meanwhile, many mental health professionals are cautioning users against replacing the personal approach of therapy with a chatbot.
- “Only a therapist can provide a personalized or customized treatment plan for you, and that takes some time and that gets actualized as you are making progress,” therapist Daniela Marin said in a YouTube video. “It doesn’t know what helps you or what doesn’t help you… It won’t keep you accountable, it doesn’t care if you do or don’t do the work.”
- “It seems that ChatGPT is really good at answering topic-based questions. It’s good at providing information about typical treatment options,” licensed marriage and family therapist Emma McAdam said recently. “But it can never provide a supportive relationship and the motivational supportive structure of actual therapy with a real person.”
Between the lines: Generative AI tools like ChatGPT have trouble distinguishing fact from fiction, and at one point Microsoft's Bing chatbot seemed to be displaying mental disorders of its own.
- For all the excitement, ChatGPT today is, in the words of OpenAI CEO Sam Altman, "a horrible product," and users who turn to it for therapeutic help are proceeding at their own risk.
The bottom line: Therapists advise caution in using chatbots, but some still see benefits both for clients as well as for their own practices.
- “When I want to find supportive information for a topic I want to suggest or defend or have my client learn about, I come to ChatGPT. An A+ tool,” Marin said. “I’m happy to use AI as a complementary tool or generator of information to help prompt my brain to think outside of the box.”
- “I think, in the middle of the night, if you don’t have a real person to talk to, this could be a place to start,” Monica Blume, clinical director at the Center for Hope, said in a video last week. “Not a place to take and act upon advice, but a place to start working out what to do.”