According to our testing, Microsoft has made several changes to Bing AI, and it's noticeably dumbed down after recent server-side updates. These changes came after journalists and users managed to access hidden modes, personal assistant features, and the emotional side of Bing Chat.
In some cases, Bing also shared internal information, such as its codename and details about its access to Microsoft's data. In a blog post, Microsoft confirmed it made a notable change to Bing "based on feedback from you all". Microsoft didn't specify what changed, but we're now seeing the impact.
In our tests, we observed that Microsoft has turned off several Bing Chat features, including its ability to create questionnaires. Before the update, Bing could create Google Form-friendly questionnaires; now, the AI refuses, saying that creating questionnaires is beyond its "capability".
To better understand what’s happening with Bing, we asked Mikhail Parakhin, CEO of Advertising and Web Services at Microsoft. He said, “this seems to be a side effect of concise responses – definitely not on purpose. I will let the team know – we will add it as a test case.”
Many users have observed that Bing's personality is now much weaker, and it often falls back on the generic "let's move on to a new topic" response, which forces you to close the chat or start a new topic. It refuses to help with questions or provide links to studies, and it doesn't answer questions directly.
If you disagree with Bing during a lengthy argument or conversation, the AI prefers not to continue, saying it's still "learning" and would appreciate users' "understanding and patience".
Bing Chat used to be fantastic, but it seems dumbed down after a flurry of incidents.