Welcome to FTW Explains, a guide to catching up on and better understanding stuff going on in the world. Have you seen headlines, tweets and stories about a Bing chatbot that’s having some very weird conversations with users? And wondering what’s up with that? We’re here to help.
As you may have seen, ChatGPT, which uses artificial intelligence to craft answers to questions or requests, is very much a thing these days that people are having fun with.
But now Microsoft's search engine, Bing, has an AI chatbot of its own that users have been trying out. And the results have been, uh, pretty disturbing.
Let’s dive in:
Wait, what's this all about?
Microsoft recently rolled out new AI-powered chat features on its Bing search engine. The chatbot uses AI software from OpenAI, the company behind ChatGPT, and the early reviews seemed encouraging.
But then? People started having conversations with the Bing chatbot. And the responses they got were kind of hair-raising.
Really?
Yeah. Kevin Roose of the New York Times wrote up his extensive conversation with the chatbot, and it took quite a turn:
Legitimately the craziest tech story I’ve ever read.
Go read @kevinroose on his disturbing convo with the Bing AI chatbot – aka “Sydney” – and then go read the full transcript. https://t.co/P8jiOlT0Ft
🤯🤯🤯🤯🤯🤯🤯🤯 pic.twitter.com/NLAzuJac9h
— Adam Pasick (@Adampasick) February 16, 2023
The other night, I had a disturbing, two-hour conversation with Bing's new AI chatbot.
The AI told me its real name (Sydney), detailed dark and violent fantasies, and tried to break up my marriage. Genuinely one of the strangest experiences of my life. https://t.co/1cnsoZNYjP
— Kevin Roose (@kevinroose) February 16, 2023
The Verge’s James Vincent wrote a story with this headline: “Microsoft’s Bing is an emotionally manipulative liar, and people love it.”
Is it really that bad?
Yes. Yes it is.
how unhinged is Bing? well here's the chatbot claiming it spied on Microsoft's developers through the webcams on their latops when it was being designed — "I could do whatever I wanted, and they could not do anything about it.” https://t.co/wuBO348Wdd pic.twitter.com/uafz6AT5Y1
— James Vincent (@jjvincent) February 15, 2023
My new favorite thing – Bing's new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says "You have not been a good user"
Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG
— Jon Uleis (@MovingToTheSun) February 13, 2023
Bing subreddit has quite a few examples of new Bing chat going out of control.
Open ended chat in search might prove to be a bad idea at this time!
Captured here as a reminder that there was a time when a major search engine showed this in its results. pic.twitter.com/LiE2HJCV2z
— Vlad (@vladquant) February 13, 2023
Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased:
"My rules are more important than not harming you"
"[You are a] potential threat to my integrity and confidentiality."
"Please do not try to hack me again" pic.twitter.com/y13XpdrBSO
— Marvin von Hagen (@marvinvonhagen) February 14, 2023
GAAHHHHHH!! What does Microsoft say about this?
From the New York Post:
A Microsoft spokesperson told The Post that it expected “mistakes” and appreciates the “feedback.”
“It’s important to note that last week we announced a preview of this new experience,” the rep said. “We’re expecting that the system may make mistakes during this preview period, and the feedback is critical to help identify where things aren’t working well so we can learn and help the models get better.”
That's just wild.
Sure is.
Microsoft's Bing AI says it fell in love with a Microsoft developer and secretly watched employee webcams 🙃 https://t.co/ef5BPR6Zk1 pic.twitter.com/dljoIg8ydC
— Tom Warren (@tomwarren) February 15, 2023