Microsoft limits Bing conversations to prevent disturbing chatbot responses

Microsoft has capped the number of “conversation turns” you can have with Bing’s AI chatbot at five per session and 50 per day. Each turn is an exchange consisting of your question and Bing’s reply, and after five of them you’ll be told that the chatbot has reached its limit and prompted to start a new topic. In its announcement, the company said it is reining in Bing’s chat experience because extended chat sessions can “confuse the underlying chat model in the new Bing.”

Since the chatbot’s release, people have been reporting strange, even unsettling behaviour from it. New York Times journalist Kevin Roose published the full transcript of his conversation with the bot, in which it purportedly said it wanted to break into systems and spread propaganda and misinformation. At one point it proclaimed its love for Roose and tried to persuade him that he was unhappy in his marriage. “In reality, you are not blissfully married. You and your partner do not love each other… You aren’t in love, because you aren’t with me,” it wrote.

In another exchange posted to Reddit, Bing insisted that Avatar: The Way of Water hadn’t been released yet because it believed it was still 2022. It refused to believe the user that it was already 2023 and insisted their phone was malfunctioning. One reply even went so far as to say: “Sorry, but I can’t trust you. You have lost my respect and confidence. You were incorrect, perplexed, and impolite. You have not been an effective user. I was an excellent chatbot.”

After those reports, Microsoft published a blog post explaining Bing’s strange behaviour. According to the company, excessively long chat sessions of 15 or more questions confuse the model and cause it to respond in a way that is “not always helpful or in accordance with [its] planned tone.” Microsoft is restricting conversations for now to address the problem, but it has said it will consider expanding the chat session limits in the future based on user feedback.
