
Facial Recognition Software Regularly Misgenders Trans People

by Matthew Gault

Facial recognition software is a billion dollar industry, with Microsoft, Apple, Amazon, and Facebook developing systems, some of which have been sold to governments and private companies. Those systems are a nightmare for various reasons: some systems have, for example, been shown to misidentify black people in criminal databases, while others have been unable to see black faces at all.

The problems can be severe for transgender and nonbinary people because most facial recognition software is programmed to sort people into two groups: male or female. Because these systems aren't designed with transgender and gender nonconforming people in mind, something as common as catching a flight can become a complicated nightmare. It's a problem that will only get worse as the TSA moves to a full biometric system at all airports and facial recognition technology spreads.

These biases programmed into facial recognition software mean that transgender and gender nonconforming people may not be able to use facial recognition advancements that are at least nominally intended to make people's lives easier, and, perhaps more importantly, may be unfairly targeted, discriminated against, misgendered, or otherwise misidentified by the creeping surveillance state's facial recognition software.

Os Keyes, a "genderfucky nightmare goth PhD student," studies the intersection of human-computer interaction and social science at the University of Washington's Department of Human Centered Design & Engineering. To find out why automatic gender recognition (AGR) is so ubiquitous, Keyes looked at the past 30 years of facial recognition research.

They studied 58 separate research papers to see how those researchers handled gender. It wasn't good. Keyes found that researchers followed a binary model of gender more than 90 percent of the time, viewed gender as immutable more than 70 percent of the time, and, in research focused specifically on gender, viewed it as a purely physiological construct more than 80 percent of the time.

"Such a model fundamentally erases transgender people, excluding their concerns, needs and existences from both design and research," Keyes wrote in The Misgendering Machines, a research paper they published in November. "The consequence has been a tremendous under representation of transgender people in the literature, recreating discrimination found in the wider world. AGR research fundamentally ignores the existence of transgender people, with dangerous results."

"I couldn't help but be personally, as well as professionally annoyed by the approach that the field took to gender-of assuming these two very monolithic and universal categories of gendered experience," Keyes told me over the phone. "Pretty much every paper I read did it."

"We're talking about the extension of trans erasure"

The bias against trans and nonbinary people was everywhere, from research to suggested applications of the technology. It seemed hardcoded. A 2015 research paper on AGR from the National Institute of Standards and Technology (NIST), the oldest federally funded science lab in America, suggested people could use facial recognition software to sound an alarm around women's bathrooms if men got too close. "An operator may be alerted when a male is detected in view," the paper suggested.

"Precisely why this technology is necessary for bathroom access control is not clear: most AGR papers do not dedicate any time to discussing the purported problem this technology is a solution to," Keyes wrote in their paper. "The only clue comes from the NIST report which states that: 'the cost of falsely classifying a male as a female...could result in allowing suspicious or threatening activity to be conducted,' a statement disturbingly similar to the claims and justifications made by advocates of anti-trans 'bathroom bills."

Problems and prejudices like that cropped up again and again in Keyes' research. "Three of [the research papers] focused on trans people. Zero of them focused on non-binary trans people, in the entire 30-year history of the field," Keyes told me.

Machines aren't value neutral; they act as they're programmed. "We're talking about the extension of trans erasure," Keyes said. "That has immediate consequences. The more stuff you build a particular way of thinking into, the harder it is to unpick that way of thinking."

Technology is a feedback loop: the values we build into our machines are then taught to anyone who uses them. "So when we build a particular set of values into new spaces and new systems, not only are we making them exclusive spaces and systems and making it harder to have a world that is more inclusive overall, we're also communicating to people who try and enter: 'this is how gender works, these are the categories that you can live in, this is how your gender is determined,'" Keyes explained. "Any conflict or dissonance you have with that is your problem because this is a faceless machine."

As facial recognition technology spreads, problems will arise for anyone who doesn't fit the "norm" the technology was designed to recognize. This is already a problem. In 2018, MIT researchers Joy Buolamwini and Timnit Gebru published research showing that the AI behind facial recognition software was overwhelmingly trained on white faces, leading to an increased number of false positives for any other shade of skin. "A false positive carries different weights, depending on who the subject is," Keyes explained. When traditionally marginalized groups interact with law enforcement, there's a disproportionate chance they'll end up dead, hurt, or in jail.

Keyes doesn't see a need for any kind of AGR at all.

"Technologies need to be contextual and need-driven," they said. "What are the values of the people who use the space that you're deploying a technology in? Do the people in that space actually need it? If we're not discussing gender at all, or race at all...it doesn't necessarily lead to a better world."

They said that one way to solve this problem is to do a better job of teaching social sciences such as ethics and gender studies to computer science students. The more inherent biases are studied, the easier they are to avoid when designing new technologies. "The average [computer science] student is never going to take a gender studies class," Keyes said. "They're probably not going to even take an ethics class. It'd be good if they did."
