‘There is no standard’: investigation finds AI algorithms objectify women’s bodies

by Gianluca Mauro and Hilke Schellmann
from Technology | The Guardian (#68N1S)

Guardian exclusive: AI tools rate photos of women as more sexually suggestive than those of men, especially if nipples, pregnant bellies or exercise is involved

  • This story was produced in partnership with the Pulitzer Center's AI Accountability Network

Images posted on social media are analyzed by artificial intelligence (AI) algorithms that decide what to amplify and what to suppress. Many of these algorithms, a Guardian investigation has found, have a gender bias, and may have been censoring and suppressing the reach of countless photos featuring women's bodies.

These AI tools, developed by large technology companies including Google and Microsoft, are meant to protect users by identifying violent or pornographic visuals so that social media companies can block them before anyone sees them. The companies claim that their AI tools can also detect "raciness", or how sexually suggestive an image is. With this classification, platforms - including Instagram and LinkedIn - may suppress contentious imagery.
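For context, raciness scores of this kind are exposed through public cloud APIs. Below is a minimal sketch of how such a score can be queried, using Google Cloud Vision's SafeSearch detection as one example of the tools the investigation examined; the image path is a placeholder, and the sketch assumes the google-cloud-vision package is installed and credentials are already configured.

```python
# Minimal sketch: querying Google Cloud Vision's SafeSearch detection,
# which returns a "racy" likelihood alongside other content categories.
# "photo.jpg" is an illustrative placeholder path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Load the image bytes to be classified.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.safe_search_detection(image=image)
annotation = response.safe_search_annotation

# Each category is reported as a likelihood enum,
# ranging from VERY_UNLIKELY to VERY_LIKELY.
print("racy:", vision.Likelihood(annotation.racy).name)
print("adult:", vision.Likelihood(annotation.adult).name)
```

The investigation's central finding is that scores like these can diverge sharply for comparable photos of men and women, which is what makes the downstream suppression decisions contentious.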

Continue reading...