Digital recreations of dead people are on the cusp of reality and urgently need regulation, AI ethicists have argued, warning “deadbots” could cause psychological harm to, and even “haunt”, their creators and users.
Such services, which are already technically possible to create and legally permissible, could let users upload their conversations with dead relatives to “bring grandma back to life” in the form of a chatbot, researchers from the University of Cambridge suggest.
They may be marketed at parents with terminal diseases who want to leave something behind for their child to interact with, or simply sold to still-healthy people who want to catalogue their entire life and create an interactive legacy.
But in each case, unscrupulous companies and thoughtless business practices could cause lasting psychological harm and fundamentally disrespect the rights of the deceased, the paper argues.
“Rapid advancements in generative AI mean that nearly anyone with internet access and some basic knowhow can revive a deceased loved one,” said Dr Katarzyna Nowaczyk-Basińska, one of the study’s co-authors at Cambridge’s Leverhulme centre for the future of intelligence (LCFI).
“This area of AI is an ethical minefield. It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example.”
One risk comes from companies that monetise their digital legacy services through advertising. Users of such a service may receive a shock when their digitally recreated loved one begins suggesting that they order takeaways rather than cooking from scratch, the paper suggests, leading to the uncomfortable realisation that the deceased was never consulted on whether their data could be used in such a way.
Much worse outcomes are possible when the users of such services are children. Parents who want to help their children deal with the loss of a mother or father may soon turn to deadbots.
But there is little evidence that such an approach is psychologically helpful, and much to suggest it could cause significant damage by short-circuiting the normal mourning process.
“No re-creation service can prove that allowing children to interact with ‘deadbots’ is beneficial or, at the very least, does not harm this vulnerable group,” the paper warns.
To preserve the dignity of the dead, as well as the psychological wellbeing of the living, the researchers suggest a suite of best practices, which may even require regulation to enforce.
Such services should have procedures for sensitively “retiring” deadbots, for instance, restrict interactive features to adults only, and be transparent about how they operate and about the limitations of any artificial system.
The idea of using a ChatGPT-style AI system to recreate a dead loved one is not science fiction. In 2021, Joshua Barbeau made headlines after using GPT-3 to create a chatbot that spoke with the voice of his dead girlfriend. Six years before that, the developer Eugenia Kuyda converted the text messages of a close friend into a chatbot, a project that ultimately led to the popular AI companion app Replika.
The technology extends beyond chatbots, too. In 2021, the genealogy site MyHeritage introduced Deep Nostalgia, a feature that created animated videos of users’ ancestors from still photos. After the feature went viral, the company admitted that some users “find it creepy”.
“The results can be controversial and it’s hard to stay indifferent to this technology,” MyHeritage said at the time. “This feature is intended for nostalgic use, that is, to bring beloved ancestors back to life. Our driver videos don’t include speech in order to prevent abuse of this, such as the creation of ‘deep fake’ videos of living people.”
A year later, MyHeritage introduced DeepStory, which allowed users to generate talking videos.