
Microsoft says talking to Bing for too long can cause it to go off the rails

by Tom Warren, from The Verge - All Posts (#68XZA)
Image: The Verge

Microsoft has responded to widespread reports of Bing's unhinged comments in a new blog post. After the search engine was seen insulting users, lying to them, and emotionally manipulating people, Microsoft says it's now acting on feedback to improve the tone and precision of responses, and warns that long chat sessions could cause issues.

Reflecting on the first seven days of public testing, Microsoft's Bing team says it didn't "fully envision" people using its chat interface "for social entertainment" or as a tool for "more general discovery of the world." It found that long, extended chat sessions with 15 or more questions can confuse the Bing model. These longer chat sessions can also make Bing "become repetitive or be prompted /..."

Continue reading...
