Microsoft provides personality settings for Bing's AI chatbot

After early users reported unusual behavior during prolonged conversations and ‘entertainment’ sessions, the company limited the Bing AI’s replies. According to The Verge, those limits frustrated some users because the chatbot would simply refuse to answer certain questions. Microsoft has steadily eased the restrictions since then, and the AI was recently updated to reduce both unresponsiveness and “hallucinations.” The bot may not be as bizarre as before, but it should be more willing to satisfy your curiosity.