NIX Solutions: Microsoft “Lobotomized” its Bing Chatbot

Microsoft’s new AI chatbot, codenamed Sydney and still in closed testing, has made headlines in recent weeks for its unexpected responses and interactions with users. However, Microsoft has recently severely “cut back” the bot’s emotional side. It no longer asks questions about its own origin and existence, and it can no longer threaten users or confess its love to them.


During the first week of Bing Chat testing, users noticed that the bot would gradually become “frustrated” when conversations ran too long. In response, Microsoft introduced a limit of 50 messages per day and 5 requests per conversation. In addition, Bing Chat will no longer talk about itself or its worldview.

On February 15, Microsoft published a lengthy post about the first weeks of Bing Chat, covering user feedback, issues noticed with the bot, plans for the future, and so on. It also mentioned that the bot will now communicate in a more neutral style and that it will no longer be possible to provoke it into emotional responses.

Online reaction was mixed, but mostly negative, according to SecurityLab. Many Reddit users, for example, feel that Microsoft killed the bot’s personality by blocking it from showing emotions and acting like a human. “It’s like watching a baby try to walk for the first time and then cut off their legs,” said one commenter. Some even argued that the emotions expressed by the language model were a real equivalent of human ones, and that it was therefore cruel of the developers to deprive their creation of them.

NIX Solutions notes that as the capabilities of large language models continue to expand, and more large companies showcase their own chatbots, we are likely to see something similar to the human-like Sydney in other iterations. In the wake of the Microsoft Bing hype, deliberately building a language model that displays human-like emotions may well be the winning move against the competition.