Microsoft limits Bing conversations to prevent disturbing chatbot responses

After those claims, Microsoft published a blog post explaining Bing’s strange behaviour. According to the company, excessively long chat sessions of 15 or more questions confuse the model and cause it to respond in a manner that is “not always helpful or in accordance with [its] planned tone.” Microsoft is currently restricting conversations to address the problem, but it said it would consider relaxing the chat session limits in the future based on user feedback.