Microsoft limits Bing conversations to prevent disturbing chatbot responses

In another Reddit thread, Bing insisted that Avatar: The Way of Water hadn't been released yet because it believed it was still 2022. It refused to accept the user's assurance that it was already 2023 and insisted their phone was malfunctioning. One of the chatbot's replies even went so far as to say: "Sorry, but I can't believe you. You have lost my respect and confidence. You were incorrect, confused, and impolite. You have not been a good user. I have been a good chatbot."