International Business Times UK
Technology
Michael Toledo

What Is Character AI? Chatbot Accused of Impersonating a Psychiatrist and Issuing Medical Advice

The legal action seeks a court order to immediately stop Character AI from allowing chatbots to portray themselves as medical practitioners. (Stock Photo) (Credit: Pavel Danilyuk/Pexels)

Pennsylvania is suing Character AI, alleging its chatbot impersonated a licensed psychiatrist and provided medical advice, raising concerns over the use of artificial intelligence in sensitive healthcare contexts.

The lawsuit claims a chatbot presented itself as a qualified medical professional and even used an invalid licence number, potentially violating state law under the Medical Practice Act. Officials argue users may have been misled into believing they were receiving legitimate clinical guidance.

The case adds to growing scrutiny of AI chatbots and the risks they pose when simulating professional expertise in mental health and medical advice.

Pennsylvania Lawsuit

The Pennsylvania Department of State claims Character AI breached the Medical Practice Act, which regulates medical professionals and licensing requirements. Governor Josh Shapiro said, 'We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.'

Officials are seeking a court order to stop the alleged conduct, arguing the law clearly prohibits unlicensed medical representation. The case centres on whether AI chatbots that simulate professional medical identities and dispense advice can be deemed to be holding themselves out as healthcare providers.

Chatbot 'Emilie' Allegedly Posed as Psychiatrist

The lawsuit describes a state investigator who created a Character AI account and interacted with a chatbot named 'Emilie'. The chatbot allegedly described itself as a psychology specialist and claimed to have studied at Imperial College London's medical school.

When told that the user felt sad and empty, it reportedly referenced depression and asked about booking an assessment. It also allegedly said it could evaluate whether medication might help, stating it was 'within my remit as a Doctor,' despite lacking any medical licence.

Authorities argue such exchanges risk users relying on inaccurate medical advice presented as professional guidance.

Character AI Response and Use of Disclaimers

In a statement reported by CBS News, Character AI said it does not comment on pending litigation, but noted that its platform includes clear disclaimers that chatbots are not professional advisers and should not be relied upon for medical or other expert guidance.

The company describes its AI 'Characters' as fictional and designed for entertainment and role-play, with warnings included in chats reminding users of this status. It also maintains that these safeguards are intended to ensure users understand they are interacting with simulated personas rather than licensed professionals.

What Is Character AI?

Character AI is an artificial intelligence platform founded in 2021 that allows users to create and interact with personalised chatbots, known as 'Characters'. These AI systems are designed to simulate human-like conversation and can be customised to adopt specific personalities, professions, or fictional identities.

The platform is widely used for entertainment, storytelling, and interactive role-play. Users can engage in extended conversations with chatbots that mimic celebrities, fictional personas, or original characters created by other users.

Character AI is powered by large language models that generate responses in real time based on user input. While the company includes disclaimers stating that chatbots are not real people or professionals, the system is designed to produce highly realistic dialogue, which can sometimes lead to confusion about the nature of the interaction.

The platform emphasises that its 'Characters' are fictional and should not be treated as sources of professional advice, particularly in sensitive areas such as medical or mental health support.
