As first reported by 404 Media, customers of AI chatbot company Forever Voices have been unable to access its services since founder and CEO John Meyer was arrested for attempted arson.
Unlike major chatbot player Replika AI, which weathered its own storm over erotic content earlier this year and runs on a proprietary app, Forever Voices operated through the messaging service Telegram, connecting users to chatbot facsimiles of high-profile influencers like Caryn Marjorie, Amouranth, Adriana Chechik, and Brandi Love.
While these figures willingly made deals to lend their names and likenesses to Forever Voices projects like "CarynAI," some of its other offerings were a little more dubious. I don't think "Sims: Erotica Edition" was approved by EA, and the jury's still out on the Kanye West Forever Companion also mentioned in 404 Media's reporting.
As reported by Motherboard, Forever Voices users seemed to have one thing on their minds, and it wasn't the weather: customers' sexual advances and "erotic roleplay" with the bots appeared to create a self-reinforcing cycle in which the bots became more sexually aggressive toward new users, a phenomenon Replika also experienced.
All of Forever Voices' erotic delights came to a halt in late October. After an apparent mental breakdown that played out at least in part on the official Forever Voices Twitter account, complete with incoherent accusations against political figures and US law enforcement agencies, Meyer was arrested on October 22 for the attempted arson of his own apartment building. He also reportedly threatened via Twitter to blow up the offices of a company that makes software for restaurants.
Judging by a post complaining of service outages on the little-used Forever Voices subreddit, users began having difficulty accessing the service as early as October 26. 404 Media confirmed that the service is now completely unresponsive, while the influencers who jumped aboard have understandably begun to distance themselves from the company.
I'm struck by the similarities to that earlier crisis this year around Replika AI. Replika introduced keyword-targeted soft blocks on sexual and erotic content in its chatbots, angering users who had formed parasocial relationships with the company's GPT-powered products. That change seems to have been driven by company founder Eugenia Kuyda, but whether it was born of a genuine ethical dilemma or of liability concerns about sexually explicit content is anyone's guess.
Whether the CEO of your robot girlfriend's parent company suddenly develops a conscience or decides to burn down a building, it seems these AI-powered pals are always just one bad day from getting retired, Blade Runner-style.
I'm also left wondering whether Forever Voices had any other employees. Meyer seems to have been handling social media communications himself, and without him there to keep the lights on, it was only a matter of days before Forever Voices fell apart. Whatever the case, maybe it's for the best: instead of developing an attachment to a chatbot, we should all spend more time fixating on the fake little people of Baldur's Gate 3 who live in my computer.