Evening Standard
Saqib Shah

Nightshade AI: The technology that ‘poisons’ images for artificial intelligence bots

The availability of generative artificial intelligence tools that can create text and images which mimic other people’s work has caused an uproar among creatives. 

Famous authors have launched copyright lawsuits against AI giants Meta and OpenAI, actors have gone on strike to protest the rise of digital replicas, and celebs have raised the alarm over their voices being cloned without their consent. 

To help artists fight back against the unwarranted use of their work, a team of researchers at the University of Chicago has developed a "poisonous" new tool that can be injected into digital art to sabotage any AI model that ingests it.

What is Nightshade?

Fittingly, the new technique is called Nightshade, after the family of flowering plants that counts some deadly species among its members. In legend and folklore, the infamous plant was used to poison Roman emperors and to turn ancient Greek troops into pigs. Now, it serves as the namesake of a new tool designed to blight AI models that feed off data scraped from the internet. 

The likes of DALL-E, Midjourney, and Stable Diffusion use this mountain of information, including pictures paired with text or captions, to learn the relationship between textual concepts and visual features. They then rely on advanced machine learning systems known as artificial neural networks to generate art, including photorealistic images.
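None of those companies publish their training pipelines, but the core idea of learning image-text associations can be sketched in a few lines. Everything below, from the toy encoders to the CLIP-style contrastive loss, is an illustrative assumption rather than any vendor's actual code:

```python
# Toy illustration of learning to associate images with their captions:
# encode both into one vector space and pull matching pairs together.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyImageEncoder(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, dim))

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)

class ToyTextEncoder(nn.Module):
    def __init__(self, vocab=1000, dim=64):
        super().__init__()
        self.emb = nn.EmbeddingBag(vocab, dim)  # averages token embeddings

    def forward(self, tokens):
        return F.normalize(self.emb(tokens), dim=-1)

img_enc, txt_enc = ToyImageEncoder(), ToyTextEncoder()
opt = torch.optim.Adam(list(img_enc.parameters()) + list(txt_enc.parameters()), lr=1e-3)

images = torch.rand(8, 3, 32, 32)           # stand-in for scraped pictures
captions = torch.randint(0, 1000, (8, 5))   # stand-in for tokenised captions

img_vecs = img_enc(images)                  # (8, 64)
txt_vecs = txt_enc(captions)                # (8, 64)
logits = img_vecs @ txt_vecs.T / 0.07       # similarity of every image to every caption
labels = torch.arange(8)                    # each image's true caption is on the diagonal
loss = F.cross_entropy(logits, labels)      # reward matching pairs, penalise mismatches
loss.backward()
opt.step()
```

Trained at scale on billions of such pairs, a model ends up with a statistical map from words like "dog" or "castle" to the visual features that usually accompany them, which is exactly the map a poisoning attack sets out to corrupt.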

However, not everyone is pleased about their work being used in this way, especially when it happens without consent or compensation. The sudden popularity of generative AI poses a dilemma for independent artists: do they keep sharing their work online and risk it being copied by AI, or do they take it private and risk losing exposure?

Nightshade is among a growing set of tools that could resolve this dilemma. Whereas some existing tools merely fool data-hungry AI models, Nightshade goes a step further by "attacking" them. Because "poisoned" images are laborious to find and remove, the technique could force AI companies to think twice about hoovering up online data.

How does Nightshade work?

Nightshade preys upon a vulnerability inherent in how AI models are trained: they ingest vast amounts of unvetted data, which leaves them open to tampering. The tool makes changes to the pixels of a digital image that are invisible to the human eye. These modifications skew the relationship between the picture and the text or captions associated with it, the very pairing an AI relies on to learn what's in an image.
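The researchers' exact optimisation lives in their paper rather than here, but the general shape of this kind of poisoning attack can be sketched as follows. Everything in the snippet, from the stand-in encoder to the pixel budget and the cat/dog concept pair, is a simplifying assumption:

```python
# Rough sketch of the general idea (not Nightshade's actual code): nudge an
# image's pixels, within a tiny budget, so a feature extractor maps it near
# a *different* concept, while the caption still names the original one.
import torch
import torch.nn.functional as F

encoder = torch.nn.Sequential(            # stand-in for a real image encoder
    torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 64)
)

cat_photo = torch.rand(1, 3, 32, 32)      # the artwork being protected, e.g. a cat
dog_photo = torch.rand(1, 3, 32, 32)      # an image of the target concept, e.g. a dog
target_features = encoder(dog_photo).detach()

delta = torch.zeros_like(cat_photo, requires_grad=True)
opt = torch.optim.Adam([delta], lr=0.01)
budget = 8 / 255                          # max per-pixel change, invisible to the eye

for _ in range(200):
    poisoned = (cat_photo + delta).clamp(0, 1)
    # push the poisoned image's features towards the "dog" concept
    loss = F.mse_loss(encoder(poisoned), target_features)
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-budget, budget)     # keep the change imperceptible

poisoned = (cat_photo + delta).detach().clamp(0, 1)
# To a person, `poisoned` still looks like the cat photo; but if it is
# scraped alongside its original "cat" caption, a model trained on it
# learns a skewed association between the word and the visual features.
```

The asymmetry is the point: the attacker only needs to shift features a model cares about, while a defender would have to inspect every scraped image for distortions no human can see.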

If enough infected images are scooped into an AI training set, they can effectively cause it to malfunction. The AI could, for instance, mistake pictures of hats for cakes, and handbags for toasters. The Nightshade attack can also spread to tangentially related images and concepts. For example, if an AI model scraped a poisoned image relating to "fantasy art", connected concepts such as "dragon" and "a castle in The Lord of the Rings" would also come out distorted.

The researchers behind Nightshade tested the attack on Stable Diffusion's latest models and on an AI model they trained themselves from scratch. After they fed Stable Diffusion 50 poisoned images of dogs, prompts for dogs began returning weird results, including creatures with too many limbs and cartoonish faces. With 300 poisoned samples, the researchers could manipulate Stable Diffusion into generating images of dogs that looked like cats. 

Nightshade's creators claim that removing this data is like finding a needle in a haystack due to the immense scale of AI datasets.

Is Nightshade available for use now?

For now, Nightshade is undergoing peer review by other researchers to validate the work. The goal is to integrate it into an existing app called Glaze, which works in a similar way by masking the style of an artwork from AI models. While Glaze is free, it's unclear whether Nightshade's creators plan to charge for the tech. The team is also making Nightshade open source, so multiple versions of it may emerge, increasing its power to derail AI systems.
