
She didn’t get an apartment because of an AI-generated score – and sued to help others avoid the same fate

by Johana Bhuiyan

Despite a stellar reference from her landlord of 17 years, Mary Louis was rejected after being screened by the tenant-screening firm SafeRent

Three hundred twenty-four. That was the score Mary Louis was given by an AI-powered tenant screening tool. The software, SafeRent, didn't explain in its 11-page report how the score was calculated or how it weighed various factors. It didn't say what the score actually signified. It just displayed Louis's number and determined it was too low. In a box next to the result, the report read: "Score recommendation: DECLINE".

Louis, who works as a security guard, had applied for an apartment in an eastern Massachusetts suburb. At the time she toured the unit, the management company said she shouldn't have a problem having her application accepted. Though she had a low credit score and some credit card debt, she had a stellar reference from her landlord of 17 years, who said she consistently paid her rent on time. She would also be using a voucher for low-income renters, guaranteeing the management company would receive at least some portion of the monthly rent in government payments. Her son, also named on the voucher, had a high credit score, indicating he could serve as a backstop against missed payments.

Continue reading...