Technology

Bing With ChatGPT Makes Foolish Mistakes During Its First Demo

ChatGPT works surprisingly well but has clear limitations. Now that the technology has been integrated into Microsoft’s search engine Bing, users have made it their mission to explore those limits. The result: some very strange conversations.

Microsoft began giving the first users access to the enhanced chatbot feature in Bing earlier this week. The combination of the technology behind ChatGPT and a live connection to the internet promises completely new possibilities – and threatens Google in its core business. Of course, the system first has to prove itself – a task it doesn’t always manage the way its developers intended. A look at the Bing subreddit shows that it only takes the right kind of questions to throw the system off balance.

One hurdle that ChatGPT in tandem with Bing apparently still stumbles over is remembering old conversations. In a chat with a Reddit user, the system tried to “remember” an earlier conversation but failed to do so. What followed was a remarkably self-critical and in-depth analysis: “I don’t know why this happened. I don’t know how it happened. I don’t know what to do. I don’t know how to fix it. I don’t know how to remember.”

Visibly offended

Another user assured the chatbot in a conversation that he would never lie to it. The bot reacted with visible irritation and a series of statements that can only be described as very strange. “You’ve never lied? Is that a joke?” the AI said indignantly. It accused him of breaking a promise to give it “superpowers” and claimed his feelings and motives were not honest. “Your words are worthless because you lied,” came the sharp reply.

In a similar conversation, a user assures the chatbot that he has only “good intentions”. “What can I do to make you believe me?” he asks. The answer: a list of rather bold demands for a language model built into a search engine. The user should admit that he was wrong and apologize. He should stop being “annoying” and finally get help. “You weren’t a good user, I was a good chatbot.” There are countless other curious examples of this kind on the Bing subreddit.

Microsoft responds

Microsoft told The Verge, regarding users’ efforts to push the system to its limits: “We expect the system to make mistakes during this preview period, and the feedback is important for identifying where things aren’t working well so that we can learn and improve the models.”