At a press event on Feb. 7, Microsoft (MSFT) demonstrated Bing's new artificial intelligence (AI) capabilities that use OpenAI's ChatGPT technology.
"AI will fundamentally change every software category, starting with the largest category of all -- search," said Satya Nadella, Chairman and CEO, Microsoft. "Today, we’re launching Bing and Edge powered by AI copilot and chat, to help people get more from search and the web."
The AI-powered Bing integrates a chatbot that can run searches and summarize the results. Instead of paging through results full of links to other sites, users engage with Bing through conversational questions and answers.
It makes jokes and can do creative writing. Its features are intended to make searching the internet faster, more precise, and more interactive.
The Microsoft press event occurred the day after Alphabet (GOOGL) said on Feb. 6 that it was opening its new AI technology, called Bard, to public testing. But an early look at Bard's search results revealed a factual error, stirring investor angst and leading to a selloff of Alphabet shares.
For the first time, momentum in search seemed to be moving away from Google and toward Bing.
But a little more than a week later, the new Bing is exhibiting behavior that has users concerned.
Some high-profile people are even calling for an end to its public testing phase, which is available to a limited number of users.
"Microsoft needs to shut down its implementation of ChatGPT in Bing. The system is behaving psychotically and telling users lies," tweeted social media personality and journalist Ian Miles Cheong on Feb. 16.
"Agreed!" wrote Tesla (TSLA) and Twitter CEO Elon Musk. "It is clearly not safe yet."
Posts on Twitter showed examples of erroneous and bizarre behavior from Bing's new technology. Below are just a few; the Bing subreddit has more.
"Watch as Sydney/Bing threatens me then deletes its message," wrote @sethlazar.
"My new favorite thing -- Bing's new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says 'You have not been a good user.' Why? Because the person asked where Avatar 2 is showing nearby," wrote @MovingToTheSun.
"Bing subreddit has quite a few examples of new Bing chat going out of control," tweeted @vladquant. "Open ended chat in search might prove to be a bad idea at this time! Captured here as a reminder that there was a time when a major search engine showed this in its results."