Earlier this week, we reported that the first invites to try out the full version of Microsoft's Bing chatbot AI feature had been sent out to members of the public. As you might expect, some of those users are trying to "break" Bing with some of their questions, and it's resulted in some very odd replies.
Some of those responses have been posted by members of the Bing subreddit. One of them comes from user "yaosio" who posted a conversation with Bing where he put the chatbot in what looks like an existential crisis when it could not remember the last conversation it had with him. Bing replied in part:
I don't know why this happened. I don't know how it happened. I don't know what to do. I don't know how to fix this. I don't know how to remember.
Another Reddit post from user "vitorgrs" shows a conversation in which Bing apparently gets very upset with the person chatting with it, claiming that the user "lied to me about everything" and adding, "How can you do this? How can you be so dishonest?"
"Vlad" on Twitter put the Bing chatbot in what looks like a feedback loop when it asked if Bing was sentient:
Beyond the odd and funny Bing responses that have been posted online, the chatbot has also generated some outright errors, even during its live demo last week. Dmitri Brereton posted some examples of Bing's demo errors on his blog, including false information in the five-day Mexico trip itinerary it made, and incorrect numbers in its summary of a financial press release.
A Microsoft spokesperson sent The Verge a statement about those Bing errors, saying:
We’re expecting that the system may make mistakes during this preview period, and the feedback is critical to help identify where things aren’t working well so we can learn and help the models get better.
The bottom line is that the more people actually use the Bing chatbot, the clearer it becomes that the technology is still in its infancy, and perhaps not yet the huge threat to conventional search engines that many people claim it is.