Evening Standard
Saqib Shah

Taylor Swift considering legal action over nude AI deepfakes shared online

Taylor Swift is considering taking legal action after explicit deepfakes of the singer were circulated to millions on social media.

The nude images, which were created using artificial intelligence, showed the pop star engaging in sexual acts with Kansas City Chiefs players and spectators. Swift has become a regular sight at the American football team’s games since she started dating their star player, Travis Kelce.

The Blank Space singer is said to be “furious” about the images and is considering taking legal action against the pornography website hosting them, according to a source close to Swift who spoke to the Daily Mail.

This isn’t the first time the superstar has clashed with the site in question. Back in 2011, gossip rag TMZ reported that Swift was threatening to sue the website for wrongly identifying her in a “leaked” nude photo.

Deepfakes are images, videos or audio clips in which a person’s face or voice is swapped out or manipulated. Their most common targets are celebrities and politicians, who are depicted doing or saying things they never did.

The rise of readily available generative AI tools has led to more convincing deepfakes, including more pornography featuring the faces of celebrities and even children.

Last October, the UK’s sweeping Online Safety Bill became law, making it illegal to share non-consensual intimate deepfakes.

The AI-generated images of Swift were also circulated on social media platforms including Reddit, Instagram and X, the platform formerly known as Twitter.

Snapping into action, Swift’s protective fans have since flooded X with images and videos of the singer, many of them tagged “Taylor Swift AI”, in a bid to bury the explicit pictures.

Despite their best efforts, some of the deepfakes are still being uploaded to the platform and are relatively easy to find.

For its part, X has issued a statement saying it’s cracking down on images containing non-consensual nudity.

“We have a zero-tolerance policy towards such content,” the company said in a tweet on Friday morning.

“Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. 

“We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed.”
