Disability rights advocates are worried about discrimination in AI hiring tools

by Sheridan Wall and Hilke Schellmann
from MIT Technology Review

Your ability to land your next job could depend on how well you play one of the AI-powered games that companies like AstraZeneca and Postmates are increasingly using in the hiring process.

Some companies that create these games, like Pymetrics and Arctic Shores, claim that they limit bias in hiring. But AI hiring games can be especially difficult to navigate for job seekers with disabilities.

In the latest episode of MIT Technology Review's podcast "In Machines We Trust," we explore how AI-powered hiring games and other tools may exclude people with disabilities. And while many people in the US are looking to the federal commission responsible for employment discrimination to regulate these technologies, the agency has yet to act.

To get a closer look, we asked Henry Claypool, a disability policy analyst, to play one of Pymetrics's games. Pymetrics measures nine skills, including attention, generosity, and risk tolerance, that CEO and cofounder Frida Polli says relate to job success.

When it works with a company looking to hire new people, Pymetrics first asks the company to identify people who are already succeeding at the job it's trying to fill and has them play its games. Then, to identify the skills most specific to the successful employees, it compares their game data with data from a random sample of players.
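As a rough illustration of that kind of comparison (this is not Pymetrics's actual method; the trait names, scores, and thresholds below are invented for the sketch), one could rank game-derived trait scores by how strongly incumbent employees differ from a random baseline sample:

```python
import numpy as np

# Hypothetical sketch: rank game-derived traits by how much
# successful incumbents differ from a random baseline of players.
# All names and numbers here are invented, not Pymetrics data.
rng = np.random.default_rng(0)
traits = ["attention", "generosity", "risk_tolerance"]

# Rows are players, columns are trait scores extracted from game play.
incumbents = rng.normal(loc=[0.6, 0.5, 0.7], scale=0.1, size=(50, 3))
baseline = rng.normal(loc=0.5, scale=0.1, size=(500, 3))

# Standardized mean difference per trait (a simple effect size);
# large values mark the traits most specific to incumbents.
diff = (incumbents.mean(axis=0) - baseline.mean(axis=0)) / baseline.std(axis=0)

for name, d in sorted(zip(traits, diff), key=lambda x: -abs(x[1])):
    print(f"{name}: {d:+.2f}")
```

The worry advocates raise follows directly from this setup: whatever traits the incumbent group happens to share, relevant or not, end up defining the hiring model.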

When he signed on, the game prompted Claypool to choose between a modified version (designed for those with color blindness, ADHD, or dyslexia) and an unmodified version. This question poses a dilemma for applicants with disabilities, he says.

"The fear is that if I click one of these, I'll disclose something that will disqualify me for the job, and if I don't click on, say, dyslexia or whatever it is that makes it difficult for me to read letters and process that information quickly, then I'll be at a disadvantage," Claypool says. "I'm going to fail either way."

Polli says Pymetrics does not tell employers which applicants requested in-game accommodations during the hiring process, which should help prevent employers from discriminating against people with certain disabilities. She added that in response to our reporting, the company will make this information clearer so applicants know that their need for an in-game accommodation is private and confidential.

The Americans with Disabilities Act requires employers to provide reasonable accommodations to people with disabilities. And if a company's hiring assessments exclude people with disabilities, then it must prove that those assessments are necessary to the job.

For employers, using games such as those produced by Arctic Shores may seem more objective. Unlike traditional psychometric tests, Arctic Shores's games evaluate candidates on the basis of the choices they make throughout the game. However, candidates often don't know what the game is measuring or what to expect as they play. For applicants with disabilities, this makes it hard to know whether they should ask for an accommodation.

Safe Hammad, CTO and cofounder of Arctic Shores, says his team is focused on making its assessments accessible to as many people as possible. People with color blindness and hearing disabilities can use the company's software without special accommodations, he says, but employers should not use such requests to screen out candidates.

The use of these tools can sometimes exclude people in ways that may not be obvious to a potential employer, though. Patti Sanchez is an employment specialist at the MacDonald Training Center in Florida who works with job seekers who are deaf or hard of hearing. About two years ago, one of her clients applied for a job at Amazon that required a video interview through HireVue.

Sanchez, who is also deaf, attempted to call and request assistance from the company, but couldn't get through. Instead, she brought her client and a sign language interpreter to the hiring site and persuaded representatives there to interview him in person. Amazon hired her client, but Sanchez says issues like these are common when navigating automated systems. (Amazon did not respond to a request for comment.)

Making hiring technology accessible means ensuring both that a candidate can use the technology and that the skills it measures don't unfairly exclude candidates with disabilities, says Alexandra Givens, the CEO of the Center for Democracy and Technology, an organization focused on civil rights in the digital age.

AI-powered hiring tools often fail to include people with disabilities when generating their training data, she says. Such people have long been excluded from the workforce, so algorithms modeled after a company's previous hires won't reflect their potential.

Even if the models could account for outliers, the way a disability presents itself varies widely from person to person. Two people with autism, for example, could have very different strengths and challenges.

"As we automate these systems, and employers push to what's fastest and most efficient, they're losing the chance for people to actually show their qualifications and their ability to do the job," Givens says. "And that is a huge loss."

A hands-off approach

Government regulators are finding it difficult to monitor AI hiring tools. In December 2020, 11 senators wrote a letter to the US Equal Employment Opportunity Commission expressing concerns about the use of hiring technologies after the covid-19 pandemic. The letter inquired about the agency's authority to investigate whether these tools discriminate, particularly against those with disabilities.

The EEOC responded with a letter in January that was leaked to MIT Technology Review. In the letter, the commission indicated that it cannot investigate AI hiring tools without a specific claim of discrimination. The letter also outlined concerns about the industry's hesitance to share data and said that variation between different companies' software would prevent the EEOC from instituting any broad policies.

"I was surprised and disappointed when I saw the response," says Roland Behm, a lawyer and advocate for people with behavioral health issues. "The whole tenor of that letter seemed to make the EEOC seem like more of a passive bystander rather than an enforcement agency."

The agency typically starts an investigation once an individual files a claim of discrimination. With AI hiring technology, though, most candidates don't know why they were rejected for the job. "I believe a reason that we haven't seen more enforcement action or private litigation in this area is due to the fact that candidates don't know that they're being graded or assessed by a computer," says Keith Sonderling, an EEOC commissioner.

Sonderling says he believes that artificial intelligence will improve the hiring process, and he hopes the agency will issue guidance for employers on how best to implement it. He says he welcomes oversight from Congress.

However, Aaron Rieke, managing director of Upturn, a nonprofit dedicated to civil rights and technology, expressed disappointment in the EEOC's response: "I actually would hope that in the years ahead, the EEOC could be a little bit more aggressive and creative in thinking about how to use that authority."

Pauline Kim, a law professor at Washington University in St. Louis, whose research focuses on algorithmic hiring tools, says the EEOC could be more proactive in gathering research and updating guidelines to help employers and AI companies comply with the law.

Behm adds that the EEOC could pursue other avenues of enforcement, including a commissioner's charge, which allows commissioners to initiate an investigation into suspected discrimination instead of requiring an individual claim (Sonderling says he is considering making such a charge). He also suggests that the EEOC consult with advocacy groups to develop guidelines for AI companies hoping to better represent people with disabilities in their algorithmic models.

It's unlikely that AI companies and employers are screening out people with disabilities on purpose, Behm says. But they haven't spent the time and effort necessary to understand the systems that are making what for many people are life-changing decisions: "Am I going to be hired or not? Can I support my family or not?"
