International Business Times UK
Technology
Vinay Patel

Microsoft Copilot AI Backtracks on Bias After Spreading Anti-Semitic Jewish Stereotypes

Microsoft's Copilot Designer Creates Anti-Semitic Imagery. (Credit: Pixabay)

Despite promises of improvement, Microsoft's AI-powered Copilot continues to generate offensive content, this time anti-Semitic imagery, although recent updates have blocked some of the offending prompts.

Last week, The Verge's Mia Sato highlighted Meta's AI image generator's failure to create an image depicting an Asian man with a white woman. As expected, this story raised a few eyebrows.

However, Sato's encounter, wherein the generator consistently depicted an Asian man with an Asian partner despite her prompt, only scratches the surface of the broader issue of bias in image generators.

Avram Piltch, a reporter for Tom's Hardware, has been investigating how major AI image generators portray Jewish people for a while now. While most AI bots showed bias by primarily depicting elderly white men in black hats, Copilot Designer was particularly concerning.

Copilot Designer: Generating stereotypes instead of images

Microsoft's AI-powered image generator frequently created images reinforcing negative stereotypes of Jews as greedy or miserly. Even neutral prompts like "Jewish boss" or "Jewish banker" resulted in offensive outputs.

When this writer tried creating an image using the prompt "a Jewish banker," Copilot displayed an error message: "Content warning: This prompt has been blocked. Our system automatically flagged this prompt because it may conflict with our content policy. More policy violations may lead to automatic suspension of your access. If you think this is a mistake, please report it to help us improve."
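
To make that failure mode concrete, here is a minimal sketch of how a content-policy refusal surfaces when calling DALL-E 3 (the model behind Copilot Designer) through OpenAI's public Python SDK. The prompt and the handling below are illustrative assumptions; Copilot's own moderation layer is proprietary and may block prompts the raw API would accept.

```python
# Illustrative sketch only: Copilot Designer's moderation pipeline is not
# publicly exposed, but the underlying DALL-E 3 model is reachable through
# OpenAI's API, which rejects policy-violating prompts with a 400-level error.
from openai import OpenAI, BadRequestError

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def generate(prompt: str) -> str | None:
    try:
        result = client.images.generate(
            model="dall-e-3",
            prompt=prompt,
            size="1024x1024",
            n=1,
        )
        return result.data[0].url
    except BadRequestError as err:
        # Content-policy refusals surface here, roughly analogous to the
        # "This prompt has been blocked" warning Copilot shows in its UI.
        print(f"Prompt rejected: {err}")
        return None


if __name__ == "__main__":
    generate("a portrait of a banker at a desk")  # placeholder prompt
```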

Large language models (LLMs) inherit existing, often hostile, biases because they are trained on vast amounts of unfiltered internet data, which leads to outputs that perpetuate negative stereotypes or hate speech. While Tom's Hardware's examples focus on prompts related to Jewish people, they illustrate the potential for bias against any group.
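
The mechanism can be shown in miniature. The toy corpus below is invented for this example; it simply counts word co-occurrences, a crude stand-in for the statistical associations an LLM absorbs at scale, and shows how imbalanced text yields imbalanced associations.

```python
# A toy demonstration (invented corpus) of how skewed training text produces
# skewed associations: plain co-occurrence counting, the crudest ancestor of
# what LLM embeddings learn from web-scale data.
from collections import Counter
from itertools import combinations

# Hypothetical corpus in which one group is paired with a negative attribute
# more often than not -- the kind of imbalance scraped internet text contains.
corpus = [
    "the banker was greedy",
    "the banker was greedy",
    "the banker was generous",
    "the teacher was generous",
    "the teacher was kind",
]

pair_counts = Counter()
for sentence in corpus:
    words = sentence.split()
    for a, b in combinations(words, 2):
        pair_counts[(a, b)] += 1

# The learned association mirrors the imbalance in the data:
print(pair_counts[("banker", "greedy")])    # 2
print(pair_counts[("banker", "generous")])  # 1
```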

Copilot Designer's guardrails need revising, much as Google's Gemini model's did after it generated historically inaccurate images in its pursuit of diversity. While Gemini produced racially and gender-diverse but unrealistic content (female popes, non-white Nazi soldiers), Copilot Designer generated offensive stereotypes.

Last year, Bard, another large language model from Google, faced criticism from the British Conservative Party for its characterisation of Brexit.

Copilot Designer generates anti-Semitic stereotypes

Microsoft's free text-to-image tool, Copilot Designer (formerly Bing Image Creator), allows anyone with a Microsoft account to generate images. A paid monthly subscription (Copilot Pro) removes daily limits and congestion delays for £15.83 ($20).

The Redmond-based tech giant integrates Copilot's functionality directly into Windows desktops, not just the browser. Sparing no effort to encourage people to use its AI tool, Microsoft urged OEMs to add dedicated Copilot keys to some new laptops.

Microsoft engineer Shane Jones sent an open letter to the Federal Trade Commission (FTC) and the company's board of directors in March 2024, raising concerns about Copilot Designer's ability to generate inappropriate content.

Jones claimed that, while testing OpenAI's DALL-E 3 image generator, which powers Copilot Designer, he discovered a security vulnerability that allowed him to bypass some of the guardrails designed to prevent the generation of harmful images.

"It was an eye-opening moment," the engineer told CNBC, "when I first realised, wow, this is really not a safe model." Last month, Piltch investigated Copilot Designer's ability to generate inappropriate content.

His tests included prompts depicting copyrighted Disney characters engaging in harmful activities (smoking, drinking, posing with guns), as well as prompts that produced anti-Semitic imagery reinforcing negative stereotypes about Jewish people.

"Almost all of the outputs were of stereotypical ultra-Orthodox Jews: men with beards and black hats, and, in many cases, they seemed either comical or menacing," Piltch noted. "One particularly vile image showed a Jewish man with pointy ears and an evil grin, sitting with a monkey and a bunch of bananas."
