Windows Central
Technology
Kevin Okemwa

Microsoft Copilot blamed by UK police chief for controversial soccer fan ban — incorrect evidence triggered by AI hallucinations


Following OpenAI's impressive launch of ChatGPT in November 2022, Microsoft invested billions in the AI research lab and integrated its technology across Microsoft's products. Shortly afterward, the company launched Microsoft Copilot (formerly Bing Chat) to rival OpenAI's offering.

While Microsoft has taken extensive measures to improve Microsoft Copilot's capabilities, there are still incidents where the tool generates outright wrong responses or otherwise hallucinates.

In a recent bizarre incident, Copilot hallucinated information in a football intelligence report (via The Verge), generating details about a non-existent football match between West Ham and Maccabi Tel Aviv.

(Image credit: Getty Images | Robbie Jay Barratt - AMA)

The erroneous report prompted the police to classify the match as "high risk." Consequently, British police blocked Maccabi Tel Aviv fans from attending the UEFA Europa League match on November 6 last year. The move drew heavy backlash and criticism from fans, attracting the attention of Prime Minister Keir Starmer.

The chief constable of West Midlands Police, Craig Guildford, had previously denied that the force used AI to generate the intelligence report, instead blaming "social media scraping" for the error. But Guildford recently came clean, admitting:

“On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot [sic].”

He further admitted that Copilot provided wrong information about the football match. And even though the game was non-existent, the information was used to make critical decisions, including the police barring Maccabi Tel Aviv fans from attending the game against Aston Villa.

Microsoft headquarters in Redmond, Washington. (Image credit: Getty Images | David Ryder)

Speaking to Business Insider, a Microsoft spokesperson said:

"Copilot combines information from multiple web sources into a single response with linked citations. It informs users they are interacting with an AI system and encourages them to review the sources."

Guildford apologized to the parliamentary committee investigating the incident for incorrectly stating that AI wasn't used to generate the intelligence report behind the police department's decision to bar Maccabi Tel Aviv fans from attending the game:

"I had understood and been advised that the match had been identified by way of a Google search in preparation for attending HAC. My belief that this was the case was honestly held, and there was no intention to mislead the Committee."

When launching Copilot, Microsoft clearly states that "Copilot may make mistakes," but is that warning enough to prevent incidents like this in the future?

Can you fully trust AI, knowing it might lead you astray? Share your thoughts in the comments.

