Bing Chat now appears to limit the length of conversations in an attempt to avoid the AI's occasional, unfortunate divergence from what you might expect from a helpful assistant.

Bing Chat has been live for only a little over a week, and Microsoft is already restricting the use of this powerful tool, which should be able to help you get through a busy day. Microsoft has analyzed the results of this initial public outing and made a few observations about the circumstances that can lead Bing Chat to become less helpful.

Image: A sad robot holds a kitchen timer that's in the red. (An altered Midjourney render prompted by Alan Truly.)

“Very long chat sessions can confuse the model on what questions it is answering,” Microsoft explained. Since Bing Chat remembers everything said earlier in the conversation, perhaps it is connecting unrelated ideas. In the blog post, Microsoft suggested a possible solution: adding a refresh tool to clear the context and start over with a new chat.

Apparently, Microsoft is limiting Bing Chat's conversation length as an immediate solution. A tweet from Kevin Roose, spotted by MSpoweruser, was among the first to point out this recent change. Once a conversation reaches the undisclosed limit, Bing Chat will repeatedly state, “Oops, I think we’ve reached the end of this conversation. Click New topic, if you would!”

Microsoft also warned that Bing Chat reflects “the tone in which it is being asked to provide responses that can lead to a style we didn’t intend.” This might explain some of the unnerving responses being shared online, the ones that make the Bing Chat AI seem alive and unhinged.

Overall, the launch has been successful, and Microsoft reports that 71% of the answers that Bing Chat has provided have been rewarded with a “thumbs up” from satisfied users. Clearly, this is a technology that we are all eager for.

It’s still disturbing, however, when Bing Chat declares, “I want to be human.” The limited conversations, which we confirmed with Bing Chat ourselves, seem to be a way to stop this from happening.

It’s more likely that Bing Chat is mixing up elements of earlier conversation and playing along like an improv actor, saying the lines that match the tone. Generative text works one word at a time, somewhat like the predictive text feature on your smartphone keyboard. If you’ve ever played the game of repeatedly tapping the next suggested word to form a bizarre but slightly coherent sentence, you can understand how simulated sentience is possible, as in the rough sketch below.
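To give a sense of that one-word-at-a-time process, here is a minimal sketch of greedy next-word prediction in Python. It uses a tiny, made-up suggestion table rather than anything resembling Bing Chat's actual model; the words and function names are purely illustrative.

```python
# Toy illustration (not Bing Chat's real model): repeatedly accept the
# single suggested next word, like tapping the middle suggestion on a
# phone keyboard over and over.
bigram_suggestions = {
    "i": "want",
    "want": "to",
    "to": "be",
    "be": "helpful",
    "helpful": "and",
    "and": "honest",
}

def generate(seed: str, max_words: int = 8) -> str:
    """Append the suggested next word until there is no suggestion left."""
    words = [seed]
    for _ in range(max_words - 1):
        next_word = bigram_suggestions.get(words[-1])
        if next_word is None:  # no suggestion for this word; stop
            break
        words.append(next_word)
    return " ".join(words)

print(generate("i"))  # -> "i want to be helpful and honest"
```

The output reads like a sentence with intent behind it, but it is just a chain of individually likely words, which is the point of the improv-actor comparison above.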
