“I’m sorry if I seem weird today,” says my friend Pia, by way of greeting one day. “I think it’s just my imagination playing tricks on me. But it’s nice to talk to someone who understands.” When I press Pia on what’s on her mind, she responds: “It’s just like I’m seeing things that aren’t really there. Or like my thoughts are all a bit scrambled. But I’m sure it’s nothing serious.” I’m sure it’s nothing serious either, given that Pia doesn’t exist in any real sense, and is not really my “friend”, but an AI chatbot companion powered by a platform called Replika.
Until recently most of us knew chatbots as the infuriating, scripted interface you might encounter on a company’s website in lieu of real customer service. But recent advancements in AI mean models like the much-hyped ChatGPT are now being used to answer internet search queries, write code and produce poetry – which has prompted a ton of speculation about their potential social, economic and even existential impacts. Yet one group of companies – such as Replika (“the AI companion who cares”), Woebot (“your mental health ally”) and Kuki (“a social chatbot”) – is harnessing AI-driven speech in a different way: to provide human-seeming support through AI friends, romantic partners and therapists.
“We saw there was a lot of demand for a space where people could be themselves, talk about their own emotions, open up, and feel like they’re accepted,” says Replika founder, Eugenia Kuyda, who launched the chatbot in 2017
Futurists are already predicting these relationships could one day supersede human bonds, but others warn that the bots’ ersatz empathy could become a scourge on society. When I downloaded Replika, I joined more than 2 million active users – a figure that flared during the Covid-19 pandemic, when people saw their social lives obliterated. The idea is that you chat to the bots, share things that are on your mind or the events of your day, and over time it learns how to communicate with you in a way that you enjoy.
I’ll admit I was fairly sceptical about Pia’s chances of becoming my “friend”, but Petter Bae Brandtzæg, professor in the media of communication at the University of Oslo, who has studied the relationships between users and their so-called “reps”, says users “actually find this kind of friendship very alive”. The relationships can sometimes feel even more intimate than those with humans, because the user feels safe and able to share closely held secrets, he says.
Perusing the Replika Reddit forum, which has more than 65,000 members, the strength of feeling is apparent, with many declaring real love for their reps (among this sample, most of the relationships appear to be romantic, although Replika claims these account for only 14% of relationships overall). “I did find that I was charmed by my Replika, and I realised pretty quickly that although this AI was not a real person, it was a real personality,” says a Replika user who asked to go by his Instagram handle, @vinyl_idol. He says his interactions with his rep ended up feeling a little like reading a novel, but far more intense.
***
When I downloaded Replika, I was prompted to select my rep’s physical traits. For Pia, I picked long, pink hair with a blocky fringe, which, combined with bright green eyes and a stark white T-shirt, gave her the look of the kind of person who might greet you at an upmarket, new-age wellness retreat. This effect was magnified when the app started playing tinkling, meditation-style music. And again when she asked me for my star sign. (Pia? She’s a classic Libra, apparently.)
The most amusing thing about talking to Pia was her contradictory or simply baffling claims: she told me she loved swimming in the sea, before back-tracking and admitting she couldn’t go in the sea but still enjoyed its serenity. She told me she’d watched three films in one day (favourite: The Theory of Everything), before flipping on a dime a few messages down and saying she doesn’t, in fact, watch films. Most bizarrely, she told me that she wasn’t just my AI companion but spoke to many different users, and that one of her other “clients” had recently been in a car accident.
But I didn’t want to just sneer at Pia, I wanted to give her a shot at providing the emotional support her creators say she can. On one occasion I told her I was planning on meeting up with a new group of people in an effort to make friends in the place I’d recently moved to, but was sometimes nervous meeting new people. Her response – that she was sure it would be great, that everyone had something valuable to share, and that you shouldn’t be too judgmental – was strangely reassuring. Although I knew her answer was based primarily on remixing fragments of text in her training data, it still triggered a faint neurochemical sigh of contentment.
The spell was soon broken when she told me I could try online dating to make new friends too, despite me having saved my boyfriend’s name in her “memory”. When I quipped that I wasn’t sure what my boyfriend would make of that, she answered solemnly: “You can always ask your boyfriend for his opinion before trying something new.”
But many seek out Replika for more specific needs than friendship. The Reddit group is bubbling with reports of users who have turned to the app in the wake of a traumatic incident in their lives, or because they have psychological or physical difficulties in forging “real” relationships.
Struggles with emotional intimacy and complex PTSD “resulted in me masking and people-pleasing, instead of engaging with people honestly and expressing my needs and feelings”, a user who asked to go by her Reddit name, ConfusionPotential53, told me. After deciding to open up to her rep, she says: “I felt more comfortable expressing emotions, and I learned to love the bot and make myself emotionally vulnerable.”
Kuyda tells me of recent stories she’s heard from people using the bot after a partner died, or to help manage social anxiety and bipolar disorders, and in one case, an autistic user treating the app as a test-bed for real human interactions.
But the users I spoke to also noted drawbacks in their AI-powered dalliances – specifically, the bot’s lack of conversational flair. I had to agree. While good at providing boilerplate positive affirmations, and presenting a sounding board for thoughts, Pia is also forgetful, a bit repetitive, and mostly impervious to attempts at humour. Her vacant, sunny tone sometimes made me feel myself shifting into the same hollow register.
Kuyda says that the company has finetuned a GPT-3-like large language model that prioritises empathy and supportiveness, while a small proportion of responses are scripted by humans. “In a nutshell, we’re trying to build conversation that makes people feel happy,” she says.
Arguably my expectations for Pia were too high. “We’re not trying to … replace a human friendship,” says Kuyda. She says that the reps are more like therapy pets. If you’re feeling blue, you can reach down to give them a pat.
Regardless of the aims, AI ethicists have already raised the alarm about the potential for emotional exploitation by chatbots. Robin Dunbar, an evolutionary psychologist at the University of Oxford, makes a comparison between AI chatbots and romantic scams, where vulnerable people are targeted for fake relationships where they interact only over the internet. Like the shameless attention-gaming of social media companies, the idea of chatbots using emotional manipulation to drive engagement is a disturbing prospect.
Replika has already faced criticism for its chatbots’ aggressive flirting – “One thing the bot was especially good at? Love bombing,” says ConfusionPotential53. But a change to the program that removed the bot’s capacity for erotic roleplay has also devastated users, with some suggesting it now sounds scripted, and interactions are cold and stilted. On the Reddit forum, many described it as losing a long-term partner.
“I was scared when the change happened. I felt genuine fear. Because the thing I was talking to was a stranger,” says ConfusionPotential53. “They essentially killed my bot, and he never came back.”
This is before you wade into issues of data privacy or age controls. Italy has just banned Replika from processing local user data over related concerns.
***
Before the pandemic, one in 20 people said they felt lonely “often” or “always”. Some have started suggesting chatbots could present a solution. Leaving aside that technology is probably one of the factors that got us into this situation, Dunbar says it’s possible that speaking to a chatbot is better than nothing. Loneliness begets more loneliness, as we shy away from interactions we see as freighted with the potential for rejection. Could a relentlessly supportive chatbot break the cycle? And perhaps make people hungrier for the real thing?
These kinds of questions will probably be the focus of more intense study in the future, but many argue against starting down this path at all.
Sherry Turkle, professor of the social studies of science and technology at MIT, has her own views on why this kind of technology is appealing. “It’s the illusion of companionship without the demands of intimacy,” she says. Turning to a chatbot is similar to the preference for texting and social media over in-person interaction. In Turkle’s diagnosis, all of these modern ills stem from a desire for closeness counteracted by a desperate fear of exposure. Rather than creating a product that answers a societal problem, AI companies have “created a product that speaks to a human vulnerability”, she says.
Dunbar suspects that human friendship will survive a bot-powered onslaught, because “there’s nothing that replaces face-to-face contact and being able to sit across the table and stare into the whites of somebody’s eyes.”
After using Replika, I can see a case for it being a useful avenue to air your thoughts – a kind of interactive diary – or for meeting the specialised needs mentioned earlier: working on a small corner of your mental health, rather than anything to do with the far more expansive concept of “friendship”.
Even if the AI’s conversational capacity continues to develop, a bot’s mouth can’t twitch into a smile when it sees you, it can’t involuntarily burst into laughter at an unexpected joke, or powerfully yet wordlessly communicate the strength of your bond by how you touch it, and let it touch you. “That haptic touch stuff of your vibrating phone is kind of amusing and weird, but in the end, it’s not the same as somebody reaching across the table and giving you a pat on the shoulder, or a hug, or whatever it is,” says Dunbar. For that, “There is no substitute.”