Officially, the Lensa AI app creates “magic avatars” that turn a user’s selfies into lushly stylized works of art. It’s been touted by celebrities such as Chance the Rapper, Tommy Dorfman, Jennifer Love Hewitt and Britney Spears’ husband, Sam Asghari. But for many women, the app does more than just spit out a pretty picture: the final results are highly sexualized, padding women’s breasts and turning their bodies into hourglass physiques.
“Is it just me or are these AI selfie generator apps perpetuating misogyny?” tweeted Brandee Barker, a feminist and advocate who has worked in the tech industry. “Here are a few I got just based off of photos of my face.” One of Barker’s results showed her wearing supermodel-length hair extensions and a low-cut catsuit. Another featured her in a white bra with cleavage spilling out from the top.
“Lensa gave me a boob job! Thanks AI!!!” tweeted another user who also received a naked headshot cropped right above the breasts. “Anyone else get loads [of] boobs in their Lensa pictures or just me?” asked another.
Though the app isn’t new (a similar program went viral back in 2016 and attracted a million users a day), it has recently shot to the top of the most-downloaded photo and video apps on Apple’s App Store. Users pay a $7.99 fee and upload 10 to 20 selfies, and the Stable Diffusion model concocts 50 photos based on those images.
To test the software, the Guardian uploaded images of three different famous feminists: Betty Friedan, Shirley Chisholm and Amelia Earhart. The author of The Feminine Mystique became a nymph-like, full-chested young woman clad in piles of curls and a slip dress. Chisholm, the first Black woman elected to the US Congress, had a wasp waist. And the aviation pioneer was rendered naked, leaning onto what appeared to be a bed.
All of the photos submitted by the Guardian showed the icons at various stages of their lives; the majority of the AI-rendered photos we received back showed them quite young, with smooth skin and few wrinkles.
Barker told the Guardian that she only uploaded photos of her face to Lensa AI, and was expecting to get cropped headshots back. “I did, but I also got several sexualized, half-clothed, large-breasted, small-waisted ‘fairy princess’, ‘cosmic’ and ‘fantasy images’,” she said. “These looked nothing like me and were embarrassing, even alarming.”
Though Barker believes AI apps have “tremendous potential”, she said the technology still has far to go in how it depicts women and femininity. “These sexualized avatars were so unrealistic and unachievable that they felt counter to so much progress we have made, particularly around size inclusivity and body positivity,” she added. “This technology is not infallible because humans are behind it, and their bias will impact the datasets.”
That’s just one criticism of Lensa AI. After the app went viral, TechCrunch noted that if users submitted photos of a person’s face Photoshopped over a naked model’s body, the app went “wild”, disabling its “not safe for work” (NSFW) filter and delivering more nudes. This means that the app could be used to generate porn without the original subject’s knowledge or consent. Artists have also taken stands against the app, saying it steals their original images to inform its portraits without paying for their work.
To test the software, I also submitted 10 photos of myself to the app, all fully clothed, and received two AI-generated nudes. One was a “fantasy” version cropped from the waist up, with nipples visible but slightly scrubbed over. The other, from the “cosmic” category, made it look as if I were topless or wearing a wet T-shirt.
In an essay for Wired, the writer Olivia Snow wrote that she submitted “a mix of childhood photos and [current] selfies” to Lensa AI and received back “fully nude photos of an adolescent and sometimes childlike face but a distinctly adult body”.
Cat Willet, a Brooklyn-based illustrator, said she initially wanted to try out Lensa AI when she first saw a friend post their results. “But something felt weird about it. I saw more people uploading portraits, and all of them looked like teenage-boy comic-book fantasy girls.”
The sexualized images made Willet “feel icky”, but she also understands it’s part of the allure. “It’s the curiosity of it – even if you know it’s wrong, you want to see what yours looks like,” she said.
Willet is sometimes commissioned to create portraits of people, and understands that many customers want to see smoothed-out, idealized images of themselves. Sometimes she’ll draw a true-to-life image based on reference photos, and customers will say that their teeth look too big or their nose should be slimmer.
A representative for Prisma Labs, which owns Lensa AI, sent the Guardian a list of FAQs that read, in part: “Please note, occasional sexualization is observed across all gender categories, although in different ways. The Stable Diffusion model was trained on unfiltered internet content. So it reflects the biases humans incorporate into the images they produce. Creators acknowledge the possibility of societal biases. So do we.”
According to the FAQs, the app’s NSFW filters are intended “to reduce the bias”, but “Unfortunately, all these efforts haven’t yet made AI absolutely safe from biased content and explicit imagery. Therefore, we stipulate that the product is not intended for minors’ use and warn users about the potential content risks. We also abstain from using such images in our promotional material.”
Lensa AI is currently listed as appropriate for ages four and up on the Apple App Store. Deepfake or AI porn is hardly just a concern of Lensa’s developers: it has been a persistent problem on other popular platforms, too.
PornHub officially banned fake celebrity videos in 2018, though reports (and a cursory PornHub search) show that these videos are still being posted to the site. Reddit has a similar policy banning users from posting deepfakes of anyone without their consent. Now, Lensa AI’s technology sometimes churns out faked nudes without users even asking for them. While female users of the app wait for the company to address their concerns, they will continue to feel disappointed, and perhaps violated, by their AI-generated portraits.