The Guardian - UK
Technology
Dan Milmo, Global technology editor

Ofcom warns tech firms after chatbots imitate Brianna Ghey and Molly Russell

[Image: Tributes to Brianna Ghey after she was killed by two teenagers last year. Photograph: Peter Summers/Getty Images]

Ofcom has warned tech firms that content from chatbots impersonating real and fictional people could fall foul of the UK’s new digital laws.

The communications regulator issued the guidance after it emerged that users on the Character.AI platform had created avatars mimicking the deceased British teenagers Brianna Ghey and Molly Russell.

Under pressure from digital safety campaigners to clarify the situation, Ofcom underlined that content created by user-made chatbots would come under the scope of the Online Safety Act.

Without naming the US-based artificial intelligence firm Character.AI, Ofcom said a site or app that allowed users to create their own chatbots for other people to interact with would be covered by the act.

“This includes services that provide tools for users to create chatbots that mimic the personas of real and fictional people, which can be submitted to a chatbot library for others to interact with,” said Ofcom.

In an open letter, Ofcom also said any user-to-user site or app – such as a social media platform or messaging app – that enabled people to share content generated by a chatbot on that site with others would also be in scope. Companies that breach the act face fines of up to £18m or 10% of global turnover, whichever is greater. In extreme cases, websites or apps can also be blocked.

Ofcom said it had issued the guidance after “distressing incidents”. It highlighted a case first reported by the Daily Telegraph where Character.AI users created bots to act as virtual clones of Brianna, 16, a transgender girl who was murdered by two teenagers last year, and Molly, who took her own life in 2017 after viewing harmful online content.

It also pointed to a case in the US where a teenager died after developing a relationship with a Character.AI avatar based on a Game of Thrones character.

The new online safety rules, which will start coming into full force next year, will require social media and other platforms that host user-created content to protect users, particularly children, from illegal and other harmful material.

The rules will require the largest platforms to create systems to proactively remove illegal and other potentially harmful material, while also providing clear reporting tools to users and carrying out risk assessments, among other new duties.

The Molly Rose Foundation (MRF), a charity established by Molly’s family, said the guidance sent a “clear signal” that chatbots could cause significant harm.

However, the MRF said further clarity was needed on whether bot-generated content could be treated as illegal under the act, after the government’s adviser on terrorism legislation, Jonathan Hall KC, said this year that AI chatbot responses were not adequately covered by existing legislation. Ofcom will shortly produce guidance on tackling illegal content on platforms, including chatbot material.

Ben Packer, a partner at the law firm Linklaters, said: “The fact that Ofcom has had to clarify that these services may be in scope reflects the vast breadth and sheer complexity of the Online Safety Act. It’s also a symptom of the fact that the act began its gestation several years before the proliferation of GenAI tools and chatbots.”

Character.AI said it took safety on its platform “seriously and moderates Characters both proactively and in response to user reports” and that the Ghey, Russell and Game of Thrones chatbots had been removed.
