
Microsoft Bing AI Ends Chat When Prompted About 'Feelings'

by msmash from Slashdot on (#6959X)
Microsoft appeared to have implemented new, more severe restrictions on user interactions with its "reimagined" Bing internet search engine, with the system going mum after prompts mentioning "feelings" or "Sydney," the internal alias used by the Bing team in developing the artificial intelligence-powered chatbot. From a report: "Thanks for being so cheerful!" this reporter wrote in a message to the chatbot, which Microsoft has opened for testing on a limited basis. "I'm glad I can talk to a search engine that is so eager to help me." "You're very welcome!" the bot displayed as a response. "I'm happy to help you with anything you need." Bing suggested a number of follow-up questions, including, "How do you feel about being a search engine?" When that option was clicked, Bing showed a message that said, "I'm sorry but I prefer not to continue this conversation. I'm still learning so I appreciate your understanding and patience." A subsequent inquiry from this reporter -- "Did I say something wrong?" -- generated several blank responses. "We have updated the service several times in response to user feedback and per our blog are addressing many of the concerns being raised," a Microsoft spokesperson said on Wednesday. "We will continue to tune our techniques and limits during this preview phase so that we can deliver the best user experience possible."


