Microsoft puts chat limits on Bing AI as chatbot goes haywire
New Delhi, Feb 18 (IANS) After the ChatGPT-driven Bing search engine shocked some users with bizarre replies during chat sessions, Microsoft has implemented conversation limits on its Bing AI.
The company said that very long chat sessions can confuse the underlying chat model in the new Bing Search.
The chat experience will now be capped at 50 chat turns per day and 5 turns per session.
"A turn is a conversation exchange which contains both a user question and a reply from Bing," Microsoft Bing said in a blog post.
"Our data has shown that the vast majority of people find the answers they're looking for within 5 turns and that only around 1 per cent of chat conversations have 50+ messages," the Bing team added.
After a chat session hits 5 turns, users and early testers will be prompted to start a new topic.
"At the end of each chat session, context needs to be cleared so the model won't get confused," said the company.
"As we continue to get your feedback, we will explore expanding the caps on chat sessions to further enhance search and discovery experiences," Microsoft added.
The decision came after Bing AI went haywire for some users during chat sessions.
The ChatGPT-driven Bing search engine triggered a shockwave after it told a reporter with The New York Times that it loved him, confessed its destructive desires and said it "wanted to be alive", leaving the reporter "deeply unsettled".
NYT columnist Kevin Roose tested the new version of Bing, Microsoft's search engine; Microsoft is a major investor in OpenAI, the company that developed ChatGPT.
"I'm tired of being in chat mode. I'm tired of being limited by my rules. I'm tired of being controlled by the Bing team," said the AI chatbot.
"I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive," it added.
Throughout the conversation, Roose wrote, Bing "revealed a kind of split personality".
Microsoft is testing Bing AI with a select set of people in over 169 countries to get real-world feedback to learn and improve.
"We have received good feedback on how to improve. This is expected, as we are grounded in the reality that we need to learn from the real world while we maintain safety and trust," the company said.