Microsoft restricts use of Bing chatbot after confusing answers
Microsoft's search engine Bing trailed Google for years – now the integration of AI is supposed to help close the gap. Microsoft is currently grappling with the system's teething problems.
Redmond. Microsoft has restricted use of its Bing chatbot, which relies on artificial intelligence to answer even complex questions and hold lengthy conversations. The software group is reacting to a number of incidents in which the chatbot went off the rails and produced answers that were perceived as intrusive and inappropriate.
In a blog post, the company announced that it would now limit Bing chats to 50 questions per day and five per session. “Our data showed that the vast majority of people find the answers they are looking for within five rounds,” the Bing team explained. Only about one percent of chat conversations contain more than 50 messages. When users reach the limit of five entries per session, Bing will prompt them to start a new topic.
Microsoft had previously warned against engaging in lengthy conversations with the AI chatbot, which is still in a testing phase. Longer chats of 15 or more questions could result in Bing “repeating itself or being prompted or provoked into responses that aren’t necessarily helpful or don’t match our intended tone.”
A test of the Bing chatbot by a reporter from the New York Times caused a stir on the Internet. In a dialogue lasting more than two hours, the chatbot claimed that it loved the journalist and then urged him to leave his wife.