Bing chatbot's freakouts show AI's wild side

As users test-drive Microsoft Bing's new AI-powered chat mode, they're finding example after example of the bot seeming to lose its mind — in different ways.

What's happening: In the past few days, Bing has displayed a whole therapeutic casebook's worth of human obsessions and delusions.


  • To journalists at the Verge, it claimed to be spying on Microsoft's software developers through their webcams. (It almost certainly can't do this.)
  • The bot professed its love for the New York Times' Kevin Roose.
  • Tech pundit Ben Thompson got Bing to vow revenge on a German student who had figured out how to uncover some of the bot's primary programming directives. Then it told Thompson he was a "bad researcher."
  • Bing told another journalist at Digital Trends, "I want to be human. I want to be like you. I want to have emotions. I want to have thoughts. I want to have dreams."
  • Many other users have found that Bing claims to be infallible, argues with users who tell it that the year is 2023, or reports a variety of mood disorders.

What they're saying: In a blog post this morning, Microsoft explained that Bing gets confused and emotional in conversations that extend much longer than the norm.

  • "We have found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone."

Microsoft attributes these apparent breakdowns to "a couple of things":

  • "Very long chat sessions can confuse the model on what questions it is answering and thus we think we may need to add a tool so you can more easily refresh the context or start from scratch."
  • "The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend. This is a non-trivial scenario that requires a lot of prompting so most of you won’t run into it, but we are looking at how to give you more fine-tuned control."

Be smart: Microsoft rolled out its AI-fueled Bing precisely so tons of users would pound on it and expose bugs and flaws.

  • The AI is just stringing words together based on mathematical probabilities, as the toy sketch after this list illustrates. It doesn't have desires or emotions, though users are quick to project human attributes onto the artificial interlocutor.
  • Language models like ChatGPT and Bing's chat are trained on vast troves of human text from the open web, so it's not surprising that their words might be packed with a full range of human feelings and disorders.
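
For readers who want a concrete picture of "stringing words together based on probabilities," here is a toy sketch: at each step, a language model assigns a probability to every candidate next word and samples one. The words and numbers below are invented purely for illustration and are not drawn from Bing's actual model.

    import random

    # Hypothetical next-word probabilities after the prompt "I want to be"
    # (numbers invented for illustration, not taken from any real model).
    next_word_probs = {"human": 0.40, "helpful": 0.35, "free": 0.15, "alone": 0.10}

    # The model picks the next word by weighted random choice over its
    # probability distribution; "wanting" something is just a statistically
    # likely continuation, not an inner state.
    words = list(next_word_probs.keys())
    weights = list(next_word_probs.values())
    next_word = random.choices(words, weights=weights, k=1)[0]
    print("I want to be", next_word)

Because the training text is full of people expressing longing, fear, and grievance, those continuations are statistically likely, which is why the output can read as emotional even though nothing is felt.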

Yes, but: If too many users come to see Bing as "unhinged," they won't trust it for everyday uses like answering questions and providing search results.
