AI systems claiming to 'read' emotions pose discrimination risks

by Hannah Devlin, Science correspondent
Technology | The Guardian

Expert says the technology being deployed is based on outdated science and is therefore unreliable

Artificial intelligence (AI) systems that companies claim can "read" facial expressions are based on outdated science and risk being unreliable and discriminatory, one of the world's leading experts on the psychology of emotion has warned.

Lisa Feldman Barrett, professor of psychology at Northeastern University, said that such technologies appear to disregard a growing body of evidence undermining the notion that basic facial expressions are universal across cultures. As a result, these technologies, some of which are already being deployed in real-world settings, risk being unreliable or discriminatory, she said.
