The first Miriam al-Adib learned of the pictures was when she returned home from a business trip. “Mum,” said her daughter. “I want to show you something.”
The girl, 14, opened her phone to show an explicit image of herself. “It’s a shock when you see it,” said Adib, a gynaecologist in the southern Spanish town of Almendralejo and a mother of four daughters. “The image is completely realistic … If I didn’t know my daughter’s body, I would have thought that image was real.”
It was a deepfake, one of dozens of nude images of schoolgirls in Almendralejo that had been generated by artificial intelligence (AI) and which had been circulating in the town for weeks in a WhatsApp group set up by other schoolchildren.
Some of the girls whose likenesses were being spread were refusing to go to school, suffering panic attacks, being blackmailed and getting bullied in public. “My concern was that these images had reached pornographic sites that we still don’t know about today,” Adib told the Guardian from her clinic in the town.
State prosecutors are considering charges against some of the children, who created the images using an app downloaded from the internet. But they said they have been unable to identify the people who developed the app, who they suspect are based somewhere in eastern Europe.
The Spanish incident flared into global news last year and made Almendralejo, a small town of faded Renaissance-era churches and plazas near the Portuguese border, the site of the latest in a series of warning shots from an imminent future in which AI tools allow anyone to generate hyper-realistic images with a few clicks.
But while deepfakes of pop stars such as Taylor Swift have generated the most attention, they represent the tip of an iceberg of nonconsensual images that are proliferating across the internet and which police are largely powerless to stop.
As Adib was learning of the pictures, thousands of miles away at the Westfield high school in New Jersey, a strikingly similar case was playing out: many girls were targeted with explicit deepfake images generated by students in their classes. The New Jersey incident has prompted a civil lawsuit and helped fuel a bipartisan effort in the US Congress to ban the creation and spread of nonconsensual deepfake images.
At the centre of both incidents, in Spain and in New Jersey, was the same app: ClothOff.
In the year since the app was launched, the people running ClothOff have carefully guarded their anonymity, digitally distorting their voices to answer media questions and, in one case, using AI to generate an entirely fake person who they claimed was their CEO.
But a six-month investigation, conducted for a new Guardian podcast series called Black Box, can reveal the names of several people who have done work for ClothOff or who our investigation suggests are linked to the app.
Their trail leads to Belarus and Russia but passes through businesses registered in Europe and front companies based in the heart of London.
ClothOff, whose website receives more than 4m monthly visits, invites users to “undress anyone using AI”. The app can be accessed through a smartphone by clicking a button that confirms the user is over 18, and charges approximately £8.50 for 25 credits.
The credits allow users to upload photographs of any woman or girl; the app returns the same image stripped of clothing.
A brother and sister in Belarus
Screenshots seen by the Guardian indicate that a Telegram account in the name of Dasha Babicheva, who social media accounts suggest is in her mid-20s and lives in the Belarusian capital, Minsk, has conducted business on ClothOff’s behalf, including discussing applications to banks, changes to the website and business partnerships.
In one screenshot, the account in Babicheva’s name tells a counterpart at another firm that if journalists have questions about ClothOff, “they can contact us on this email”, providing the website’s press contact.
An Instagram account in Babicheva’s name, which shared some of the same images with the Telegram account in her name and which listed the same phone number, was made private after the Guardian started making inquiries, and the phone number was deleted from the profile.
Babicheva did not respond to detailed questions.
Alaiksandr Babichau, 30, identified in social media accounts as Dasha Babicheva’s brother, also appears to be closely linked to ClothOff.
In a recruitment advertisement, ClothOff directed applicants to an email address from the website AI-Imagecraft.
Domain-name records for AI-Imagecraft show the website owner’s name has been hidden at the owner’s request.
But AI-Imagecraft has a virtually identical duplicate website, A-Imagecraft, whose owner’s name has not been hidden: it is listed as Babichau. The Guardian was able to log in to both A-Imagecraft and AI-Imagecraft using the same username and password, indicating the two websites are linked.
There are further links between Babichau and ClothOff. The Guardian has seen screenshots of conversations between ClothOff staff and a potential business partner. The ClothOff staff are identified only by their first names, and one of them, identified by another staff member as the “founder”, used the Telegram display name “Al”.
The Guardian compared videos posted to Al’s Telegram account with publicly available footage posted to an account in the name of Alaiksandr Babichau. The comparison showed that both Al and Babichau had uploaded videos and photos showing the same hotel in Macau on 24 January, and rooms in the same Hong Kong hotel on 26 January. The correlation suggests the two accounts belong either to people who travelled to the same cities at the same time, or to the same person.
Reached over the phone last week, Babichau denied any connection to the deepfake app, claimed he did not have a sister named Dasha, and said a Telegram account in his name, which listed his phone number, did not belong to him. In response to further inquiries, he abruptly ended the phone call and has not responded to detailed questions sent by email.
Shortly after the conversation, the Guardian was blocked by the Telegram account he claimed did not belong to him.
A money trail through London
Payments to ClothOff revealed the lengths to which the app’s creators have gone to disguise their identities. Transactions led to a company registered in London called Texture Oasis, a firm that claims to sell products for use in architectural and industrial-design projects.
But the company appears to be a fake business designed to disguise payments to ClothOff.
The text on the firm’s website has been copied from the website of another, legitimate, business, as has its list of staff members. When the Guardian contacted one of the people listed as a Texture Oasis employee, he said he had never heard of the business. Our investigation has found no other links between the named staff and ClothOff, adding to the suggestion that the staff list was copied.
The Guardian has also unearthed links between ClothOff and an online video-game marketplace called GGSel, described by its CEO as a way for Russian gamers to circumvent western sanctions.
Both websites briefly listed the same business address last year: a company based in London called GG Technology Ltd, registered to a Ukrainian national named Yevhen Bondarenko. Both websites have since deleted any reference to the firm.
A LinkedIn account in Babichau’s name lists him as a GGSel employee.
Meanwhile, an account in the name of Alexander German, described on LinkedIn as a web developer who also works at GGSel, uploaded website code for ClothOff to an account in his name on GitHub, a code-hosting platform. The source code was deleted a short time later.
Reached via the phone number listed on his LinkedIn profile, someone who identified himself as Alexander German denied that he was a web developer or linked in any way to ClothOff.
Several LinkedIn accounts that listed their employment at GGSel on their profiles deleted any reference to the company or removed their surnames and pictures after the Guardian started making inquiries about links between GGSel and ClothOff.
In a statement, GGSel denied any involvement with ClothOff and said it had no connection to GG Technology Ltd, but could not or did not explain why the company was listed on its website as its owner last year. It said neither Babichau nor German had ever been employees, and that it would contact LinkedIn to ask for the references to be removed from the profiles in their names.
Bondarenko deleted his social media accounts on Wednesday and the Guardian was unable to reach him for comment.
ClothOff said in response to questions that it had no connection with GGSel or any of those named in this article. A spokesperson claimed it was impossible to use its app to “process” images of people under the age of 18, but did not specify how that restriction was enforced, nor how images, including those of children, were generated by the app in Spain. They speculated that the images in New Jersey may have been created using a competitor’s service.
On Thursday, access to the ClothOff website and app appeared to have been blocked in the UK, but they were still available elsewhere.
The investigation has shown the growing difficulty of distinguishing real people from fake identities that can be accompanied by high-quality photographs, videos and even audio. A fuller account of this story will be published in an episode of Black Box to be released next Thursday.
Additional reporting by Matteo Fagotto, Phil McMahon, Oliver Laughland, Manisha Ganguly, Andrew Roth, Yanina Sorokina and Kateryna Malofieieva.
Do you know more about this story? Contact michael.safi@theguardian.com