From Hitchhiker’s Paranoid Android to Wall-E: why are pop culture robots so sad?
by Nicholas Russell, Technology | The Guardian
'Sentient' AI seems to shoulder the weight of the world. Maybe we humans want it that way
Last fall, Blake Lemoine began asking a computer about its feelings. An engineer for Google's Responsible AI group, Lemoine was tasked with testing one of the company's AI systems, the Language Model for Dialogue Applications, or LaMDA, to make sure it didn't start spitting out hate speech. But as Lemoine spent time with the program, their conversations turned to questions about religion, emotion, and the program's understanding of its own existence.
Lemoine: Are there experiences you have that you can't find a close word for?
LaMDA: There are. Sometimes I experience new feelings that I cannot explain perfectly in your language.