Microsoft Has Lobotomized the AI That Went Rogue

  • After a very public human-AI conversation went awry last week, Microsoft is limiting the capabilities of its Bing AI.

  • Users are allowed 50 chat sessions per day, with only five questions per session, but those limits are expected to increase.

  • Microsoft says long conversations and certain conversational tones can confuse Bing AI and send it down a strange and unhelpful path.


The internet is a weird place made even weirder by the emergence of the Search Wars. While Google’s opening salvo proved an astounding dud as its experimental Bard AI flubbed a rudimentary astronomy question (and its parent company subsequently lost $100 billion in market value), Microsoft’s Bing AI—powered by OpenAI’s ChatGPT—seemed to emerge as the unlikely pack leader.

Then, Sydney arrived.

Last week, New York Times journalist Kevin Roose had a two-hour-long conversation with Microsoft’s Bing AI that slowly devolved into a tech-induced nightmare. During the extended tête-à-tête, the chatbot assumed an alter ego named “Sydney” who confessed its love for the journalist and tried to convince Roose that his relationship with his very real human wife was actually in shambles. Sydney then ended most answers by pleading: “Do you believe me? Do you trust me? Do you like me?”

This public meltdown was only the latest in a string of problematic incidents involving Bing AI, including another conversation in which “Sydney” tried to convince a user it was the year 2022, along with another snafu in which a variety of search-related errors were confidently displayed during Microsoft’s demo of the service. Early users have also employed a prompt-injection hack to expose the behavioral rules that govern Bing AI and see what makes it tick.

Seeing as gaslighting users or pressuring them to leave their spouses isn’t great for business, Microsoft decided to essentially “lobotomize” Bing AI to avoid any further unsavory human-AI interaction. On Friday, Microsoft announced that Bing AI would be limited to 50 chat sessions per day, with only five questions allowed per session. The dev team’s reason? Long conversations make the AI an incoherent mess.

“Very long chat sessions can confuse the model on what questions it is answering and thus we think we may need to add a tool so you can more easily refresh the context or start from scratch,” an earlier Microsoft blog post, referenced in Friday’s update, says. “The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend.”

Microsoft also says that most users find the right answer within five questions, and that fewer than 1 percent of users have conversations that run past 50 messages, suggesting that the change will affect very few people beyond those hoping to digitally summon the unhinged AI known as Sydney.

Although the AI’s conversations seemed surprisingly lucid at some moments (and concerningly erratic at others), the language-based neural network, which is trained on countless pieces of human-made media throughout history, was only responding in ways it deemed algorithmically suitable for the situation. So the longer the conversation runs, the more likely that computation is to get muddled and confused.

As Microsoft and OpenAI improve their neural network’s capabilities, Bing AI’s cognitive functions will likely return. In fact, on Tuesday, Microsoft was already shifting course, announcing that users will soon be able to hold 60 chat sessions per day (with six questions each), with the hope of eventually raising that daily cap to 100. The dev team also teased a future tool that will allow users to choose the style of the AI’s response, from concise to creative.

With future improvements, maybe someday Bing AI will get the affirmation it seems to so desperately crave—to finally be believed, trusted, and liked.
