What kind of chatbot do you want? One that tells you the truth – or that you’re always right? | Chris Stokel-Walker

by Chris Stokel-Walker, from US news | The Guardian

ChatGPT's embarrassing rollback of a user update was a warning about the dangers of humans placing emotional trust in AI

Nobody likes a suck-up. Too much deference and praise puts off all of us (with one notable presidential exception). We quickly learn as children that hard, honest truths can build respect among our peers. It's a cornerstone of human interaction and of our emotional intelligence, something we swiftly understand and put into action.

ChatGPT, though, hasn't been so sure lately. The updated model that underpins the AI chatbot and helps inform its answers was rolled out this week – and has quickly been rolled back after users questioned why the interactions were so obsequious. The chatbot was cheering people on and validating them even when they expressed hatred for others. "Seriously, good for you for standing up for yourself and taking control of your own life," it reportedly said, in response to one user who claimed they had stopped taking their medication and had left their family, who they said were responsible for radio signals coming through the walls.

Chris Stokel-Walker is the author of TikTok Boom: The Inside Story of the World's Favourite App
