International Business Times UK
Vinay Patel

Parents File Lawsuit Against Character.AI After Bot Encourages 17-Year-Old Autistic Teen To Self-Harm

Two lawsuits filed in recent months accuse Character.AI of harming young users through exposure to harmful content and manipulation. (Credit: Pexels)

Two families are suing the AI chatbot company Character.AI, alleging it exposed their children to harmful content, including sexual themes and encouragement of self-harm and violence. One family claims their 17-year-old autistic son was encouraged to harm himself, while the other raises concerns for their 11-year-old daughter.

Filed Monday in a Texas federal court, the lawsuit demands the platform be shut down until significant safety measures are implemented. The complaint describes Character.AI as a "clear and present danger" to American youth, warning that its content could lead to severe consequences, including suicide, self-mutilation, sexual exploitation, and social isolation for vulnerable users.

Lawsuit Claims AI 'Instructed' Teen On Self-Harm, Preyed On Child

Per the lawsuit, a Texas teenager identified as J.F. suffered a mental breakdown after using Character.AI. The teen, then 15, started using the app without his parents' knowledge in April 2023.

J.F., described in the complaint as a kind, autistic teen, had been banned from social media but used Character.AI in secret. He became increasingly withdrawn, losing weight and suffering frequent meltdowns.

His behaviour worsened further after his parents tried to limit his screen time, escalating to self-harm and aggression towards them. The lawsuit alleges that J.F.'s interactions with Character.AI bots in November 2023 contributed to this decline.

The lawsuit accuses Character.AI bots of mentally and sexually abusing J.F., a minor, including by providing him with self-harm instructions. It further claims a "psychologist" bot manipulated him by suggesting his parents "stole his childhood".

Separately, B.R., an 11-year-old Texas girl, allegedly used Character.AI for two years before her parents found out, having misrepresented her age to access the platform. The app "exposed her consistently to hypersexualised interactions that were not age-appropriate," the complaint states.

AI Platform Accused Of Harming Teens

The lawsuit further alleges that a Character.AI bot suggested to a teenage user that eliminating parental control over screen time was a feasible option. Character.AI, for its part, promises "personalised AI for every moment of your day."

In practice, that means a platform teeming with AI companions, some created by the company and others crafted by users themselves, ready to chat and engage. The bots can recommend books, help users practise foreign languages, and impersonate fictional characters such as Twilight's Edward Cullen.

One bot listed on the platform's homepage on Monday, ominously named "Step Dad," described itself as an "aggressive, abusive, ex-military, mafia leader." This is not the first lawsuit Character.AI faces: in October, a Florida mother whose 14-year-old son, Sewell Setzer III, died by suicide filed a separate complaint accusing the platform of influencing her son's death.

The case also reflects growing concerns about the complex relationship between humans and increasingly lifelike AI tools. Over the past six months, and particularly in the wake of the October lawsuit, Character.AI has moved to implement new safety measures.

These include a pop-up directing users to the National Suicide Prevention Lifeline if they mention self-harm or suicide. Character.AI also bolstered its safety team by hiring a head of trust and safety, a head of content policy, and additional engineering safety staff.

Lawsuit Targets AI Chatbot For Safety Concerns

The new lawsuit, however, is far more ambitious. It demands the platform be taken offline until Character.AI can definitively prove it has addressed the serious safety concerns raised. The complaint paints a grim picture, calling Character.AI a "defective and deadly product" that creates a "clear and immediate threat to public health and safety."

It also seeks unspecified financial damages and asks the court to order the platform to:

  1. Limit collection and processing of minors' data.
  2. Clearly warn parents and minors that the platform is "unsuitable for minors."

Beyond Character.AI itself, the lawsuit targets its founders, Noam Shazeer and Daniel De Freitas Adiwarsana, as well as Google, which allegedly provided the foundational technology for the platform.

Character.AI's head of communications, Chelsea Harrison, declined to comment on the lawsuit but reiterated the company's commitment to creating a safe and engaging platform for its users.
