The new Microsoft Bing will sometimes misrepresent the info it finds

by Sean Hollister
from The Verge - All Posts on (#68MFZ)
Photo by Tom Warren / The Verge

Search engines are about to change in a very important way: when you type in a query and get an official-looking answer, it might be wrong - because an AI chatbot created it.

Today, Microsoft announced a new version of its Bing search engine that will provide "complete answers" to your questions by tapping into the power of ChatGPT. You can already try some canned sample searches and sign up for more.

But even though Microsoft is taking many more precautions than it did with its 2016 failure Tay - a chatbot that Twitter taught to be racist and misogynistic in less than a day - the company is still proactively warning that some of the new Bing's results might be bad.

Here are a couple of key passages from Microsoft's new Bing FAQ:

Bing tries to keep...

Continue reading...

External Content
Source RSS or Atom Feed
Feed Location http://www.theverge.com/rss/index.xml
Feed Title The Verge - All Posts
Feed Link https://www.theverge.com/