
Podcast: Facial recognition is quietly being used to control access to housing and social services

by
Tate Ryan-Mosley
from MIT Technology Review

Facial recognition technology is being deployed in housing projects, homeless shelters, schools, even across entire cities, usually without much fanfare or discussion. To some, this represents a critical technology for helping vulnerable communities gain access to social services. For others, it's a flagrant invasion of privacy and violation of human dignity. In this episode, we speak to the advocates, technologists, and dissidents dealing with the messy consequences that come when a technology that can identify you almost anywhere (even if you're wearing a mask) is deployed without any clear playbook for regulating or managing it.

We meet:
  • Eric Williams, senior staff attorney at Detroit Justice Center
  • Fabian Rogers, community advocate at Surveillance Technology Oversight Project
  • Helen Knight, founder of Tech for Social Good
  • Ray Bolling, president and co-founder of Eyemetric Identity Systems
  • Mary Sunden, executive director of the Christ Church Community Development Corporation
Credits

This episode was reported and produced by Jennifer Strong, Tate Ryan-Mosley, Emma Cillekens, and Karen Hao. We're edited by Michael Reilly and Gideon Lichfield.

Transcript

[TR ID]

Strong: So, I'm in lower Manhattan next to some buildings known as Knickerbocker Village. You might hear the subway running up overhead there. So history buffs might know this spot as kind of a birthplace of housing rights. Some of New York City's first regulations on rental housing came to exist here because of the tenants association. These buildings were also among the very first federally funded affordable housing units. They're about 90 years old and were built to support a new middle class after the Great Depression. But the reason I'm here is because it's also among the very first apartment complexes in New York City to install facial recognition instead of keys. And this is not recent either. It was about seven years ago, after Hurricane Sandy tore through the area, doing incredible damage and causing awful flooding. And it was in those repairs that the apartments received some upgrades, including this keyless entry system at all the gates that uses face ID.

And you know, it doesn't look super different from the usual call boxes that let you buzz somebody up and I did kind of wonder how easy they'd be to spot, but it's actually super easy for a few reasons. One is the days are short now. And so it's kind of dark out here and all the entry points have these really bright spotlights shining on them, I guess, to light people's faces and make it easier for the system to detect them. The other is this kind of funny little dance, maybe it's like a lean in lean out motion that some of the folks trying to get into the apartment building are doing as they get closer to these cameras. It's pretty chilly out here. So I'm guessing they're trying to get the door to unlock more quickly.

Strong: Problems with this setup.. aren't new. It's well documented that facial recognition often doesn't work as well on non-white faces... which explains why some residents have had difficulty gaining access to this building. It's also unlocked the door to faces of non-residents in the past. In the future, use of this technology might be regulated... but for now, how and where it gets used remains a wild west. I'm Jennifer Strong and as part of our latest miniseries on face I-D, we're going to look at use in housing, and a bunch of thorny questions that raises.

[SHOW ID]

[Sound from Green Light Announcement: and I welcome you all today to the Detroit public safety headquarters for this exciting announcement of Project Green Light Detroit.]

Strong: That was 2016... and the birth of what would become one of the more controversial surveillance projects in the US.

[Sound from Green Light Announcement: The violence level in this city is not acceptable.]

Strong: For decades, Detroit, Michigan, has been listed among the most violent cities in the country. This is Mike Duggan, the city's mayor.

[Sound from Green Light Announcement: But if we don't do something differently we are not going to change the trajectory.... and so how many times have you seen on TV if you recognize this person call in and there is such a grainy image you have no idea. It could be your own brother and you wouldn't know from that image...]

Strong: And so, they tried something new... a collaboration between police and local businesses.. with the latest technology.

[Sound from Green Light Announcement: ...and we said.. what if we went to some gas stations, who volunteered, and set a standard that you light your gas station all the way to the perimeter so bright you can read a magazine. And then you put such high-definition color cameras and put in an internet connection, we will monitor it in real time at police headquarters.]

Strong: In other words... private companies pay to install top-quality cameras that stream live video back to police headquarters... and in return, they receive more time and services from police.

[Sound from Green Light Announcement (Mayor Mike Duggan): ... ok so it is on that screen right there]

Strong: Next.. he shows people at the press conference how this system works.

[Sound from Green Light Announcement (Mayor Mike Duggan): but I want you to see the visibility. [wow!] That is what our officers are looking at upstairs in the real time crime centre right now. They can see dozens of screens at once...I mean how much different is this.]

Strong: And as you can hear from the reaction of the crowd, the quality of this video is a total game changer.

[Sound from Green Light Announcement (Mayor Mike Duggan): This is what we are going to do across the city.... And the other thing is...We've got facial recognition software coming next. We are going to be able... before too long... to match outstanding warrants against these cameras... So this is what it looks like at night. Take a look. 10 o'clock at night. You can see how bright the lighting is. And this is why it was so critical that we have the lighting standards in addition to the camera standard.]

Strong: Today, Project Green Light's cameras cover hundreds of shops... They're also pointed at public housing... schools... and transportation. But when it started five years ago it was just a handful of gas stations.

[Sound from Green Light Announcement: So it fell upon us as small business owners to be that catalyst. Because for Detroit to be a great city and come back... the neighborhoods have to come back. They have to be safe. Public safety is a right of every citizen in the city of Detroit and we are honored to be part of it.]

Strong: At the time there was a lot of support for Project Green Light... many saw it as the community coming together... working with each other and with police to take back its streets...

[Sound from Green Light Announcement: Project Greenlight is good news. And not just for our police, but it's great news for our community. We should celebrate it. We should support it. We should expand it. When people talk about neighborhoods, that we're not doing enough for neighborhoods, well, you need to tell them about this program. This is the spirit of Detroit. This just made my job so much easier, and if I can have a video of the crime being committed... I've just won my case. (clapping)]

Strong: But looking back, it's not so straightforward.

Williams: There is absolutely no evidence whatsoever that Project Green Light has produced a reduction in crime in Detroit.

Strong: Eric Williams is a senior staff attorney at the nonprofit law firm... Detroit Justice Center.

Williams: In terms of its success with the initial eight stations, there was a substantial reduction in criminal activity at those sites, right, year over year. Since those eight sites, the findings are much less clear.

Strong: And activists including Williams have other concerns with this project.

Williams: Ironically, Project Green Light is something that I became involved in prior to actually going to the Detroit Justice Center. I was just, my sensibilities are just offended by its very existence. I have problems with that kind of police surveillance.

Strong: His main point of contention?

Williams: The problem that I have with Project Green Light in particular is the utter lack of transparency, right? For example, facial recognition technology was being used in conjunction with Project Green Light for almost two years before people really knew anything about it.

Strong: He says the public didn't really grasp what was happening. That their comings and goings would be livestreamed to police headquarters from hundreds of points around the city... and their faces could be scanned and identified. Though, as you heard, the city was quite clear from the day of launch that it planned to use facial recognition. It also provides a public map with the locations of all these cameras... and there are signs, often with a pulsing green light, that mark their presence. But... how many people really pay attention to local government officials talking about public-private partnerships? And what would consent mean in this context? There's little choice when your home, job, school and local shops... are all surrounded by these cameras...

[Sound from protests in Detroit: chanting]

Strong: Over the summer... tensions between Detroit citizens and the police boiled to the surface. And... one of the key requests?

[Sound from protests in Detroit: These are our collective demands.... The second one is... getting rid of Project Green Light...]

Williams: We'd like to see, for example, legislation prohibiting the use of facial recognition technology in public housing. Like right now, Project Green Light is on public buses, is in public schools, and actually in places where poor people are disproportionately likely to be, right? This is how that's working. So we'd like to see, if nothing else, a moratorium on its use by law enforcement.

Strong: But facial recognition isn't just creeping into public housing via police. As popularity of the technology grows and the cost of acquiring it drops, landlords big and small are looking to install it as a way to increase security or present the property as having the latest upgrades and conveniences, like keyless entry. Facial recognition isn't regulated, so we simply don't know how often it's used or whether those uses are effective. But we did meet a tenant who, at least for the moment, has stopped his landlord from installing a system that unlocks doors with faces instead of keys.

Fabian: I come from Atlantic Plaza Towers, where our landlord tried to install facial recognition into the building.

Strong: Fabian Rogers is a tenant and community advocate in New York City.

Fabian: So, I'm here speaking on behalf of those who usually don't have the time or money to do so.

Strong: Atlantic Plaza is a rent-stabilized housing complex in Brooklyn. Two years ago he found a notice in his mailbox about improvements to the buildings.

Fabian: We didn't know what we were dealing with at the time, when everybody did background research trying to understand what it was that the landlord was trying to put in, 'cause this wasn't your typical major capital improvement where he's just trying to change the awnings on the building or trying to update the terraces. He's trying to put in facial recognition at the front. And so it would be mandatory to have to use the technology if it were to be deployed and installed in each building.

Strong: Together with more than a hundred other tenants... he decided to fight back.

Fabian: The concerns that we had as tenants were not for more security or more surveillance. We were already being over-surveilled in our building as is, with 24/7 HD recording cameras in every nook and cranny of these buildings except in our apartments and the fire escape stairs. We just understood this as another tool to pester us in our living situation. This wasn't just welcoming a new technology that would help our lives be better. It was never that type of situation.

Strong: They self-organized and sent a letter to the management company opposing the cameras.

Fabian: To this day, we have never gotten a response back.

Strong: And this bugged them. So they enlisted the help of a legal firm... and they got a meeting with the landlord.

Fabian: What's the purpose of you installing this technology? Like, what's the purpose of this for us? What about the case where police want subpoenas, and they find they have a suspect in the building? Now what? Like, what databases does this tech access? What does it share? What does it save? How does it save it? Who does it give the data to? And things like that, all of these ethical questions and logistical questions... they hadn't even wrapped their heads around it. They were just caught up in, like, we got some new tech from a startup company, we want to try to implement it so we can try to get some new tenants who are willing to pay a higher rent.

Strong: Ultimately, the cameras weren't installed. But that doesn't mean the issue is settled. Security projects like this one might get funding from the US Department of Housing and Urban Development, but the department doesn't track whether face ID gets used or how. And landlords can pretty much do what they want with it.

Fabian: And it's just scary. And again, this all happens because there's a lack of legislation. I'm not a stickler against technological innovation. That's not what I'm saying here. What I'm saying is we need to question the contextuality in which we're using new technology that is, you know, proliferating and being deployed in society. Are they really ethical? Are they really better than the low-tech solutions that we have? Is techno-solutionism always the answer to the issue at hand? And for housing, I personally don't think so.

Strong: There is legislation before the House and Senate that would put a pause on using identity technologies in housing that's financially supported by the federal government. It's called the No Biometric Barriers to Housing Act. New York City is also considering coming up with its own rules on biometric data, housing, and landlords, but blanket bans are concerning to some, including a few people we met who provide housing to the homeless.

Knight: I think that we're losing an entire category of a way to serve and support people who don't have trust, but need help.

Strong: Helen Knight is the founder of a Canadian startup that works with nonprofits. It's called tech for social good dot c-a. She argues there's a strong use case for face I-D to help vulnerable communities.

Knight: They are victims of crime. They have recently been evicted from their home. They're in a crisis situation and it's completely reasonable that they do not have a driver's license. Do not have a passport. Don't have any form of picture ID. They need help today. So you can't turn them away. You shouldn't turn them away. The intention is to provide them service.

Strong: But to do that effectively, shelters need to be able to show how many different people use their services and who those people are... It's how they're funded. It's also how they connect people to social services and other programs.

Knight: As it stands right now, the onus is on the person that's in the middle of a crisis situation to articulate their exact need. I mean, without having an actual way to identify what their journey through the system has been, and see where they've been to other facilities or food banks or medical programs, then it's just, it's just me standing before you telling you my story over and over again, telling you about the trauma in my life. And hoping that I say the magic words that end up with getting me a house.

Strong: She directed use of this technology for a homeless shelter in the city of Calgary... and she says all the identification methods available... have problems.

Knight: So there was a facility that used fingerprint identification, uh, for over a decade, and largely it works well. It doesn't work completely, depending on the people in the shelter facility. It doesn't really work on cold fingers, it doesn't work on dirty fingers, it doesn't work on damaged fingers, and it doesn't work on missing fingers.

Strong: And issues with using somebody's fingerprints for ID run deeper than just whether they work or not...

Knight: It's not trauma-informed to have somebody walk into a facility and say, yes, you can absolutely come in, but let me just take your fingerprints. There's multiple problems. One, the screen is glowing green, which is a trigger and scary for people that are having paranoid delusions. Perhaps they don't want to touch that scary thing, the potential, um, disease transmission, uh, but also the mental experience of having been arrested and fingerprinted in the past. You're actually repeating that same trauma every time they enter the facility.

Strong: She's also tried tracking devices powered by R-F-I-D... which stands for radio-frequency identification. And they tried iris recognition, which is where a scanner reads a person's eyes. Ultimately though, she found face recognition to be the thing that worked.

Knight: So, uh, facial recognition is really the only one that is possible and available, um, at all times. Facial recognition is kinder emotionally in that it's passive, it doesn't touch them, and you can capture it more quickly, and there's no risk of transmission. So I'm eager to see the extended application of facial recognition, because it's the kindest thing that you can do for people that have had traumatic lives, and still know who they are and help them in the way that they need to be helped.

Bolling: Is that a photo of you?

Strong: Oh, it is. Yes that's my LinkedIn.

Bolling: So I guess I just identified you utilizing a photo that I took from LinkedIn that I enrolled in our software and then just utilizing a regular webcam with you wearing a mask. I was able to identify you.

Strong: Wow. Wow. That's... that's me with a mask and a hood and headphones and whatever else around my face here.

Bolling: Hello, my name is Ray Bolling. I am the president and co-founder of Eyemetric Identity Systems.

Strong: We're outside on a city street and he's giving me a run-through of his software. His company creates applications powered by fingerprints, iris scans, and now facial recognition to identify someone when they enter a building... including shelters for the homeless... food banks... and public schools.

Bolling: We have seen an increased interest in this technology as a result of COVID. We need to be able to do things in a contactless manner, um, but still be able to properly identify people, and do it in a way that doesn't involve a transfer. You know, with our visitor management system, where we'd ask someone to hand over their driver's license, there's an exchange of a card or an ID, and, um, you potentially could be transferring germs. Utilizing a contactless method of facial recognition or iris, it's something that, um, doesn't require any transfer of information, you know, of a physical piece of plastic.

Strong: Other facial recognition systems we've talked about on this show sit on top of massive databases of photos taken from the internet. But not this one.

Bolling: Yes, our customers, when they purchase or install one of our systems, they are starting with an empty database. They build the database themselves with known people that are willingly and actively participating in the enrollment of the system. They are also actively involved when they are recognized.

Strong: In practice it might work something like this...

Bolling: Typically we will get an import of all the students or staff members from a school district. And we have all their information. Our first attempt is to try to enroll everybody off of a photograph, an existing photograph, but they have to be of good quality. Typically a school portrait picture works well, but there are cases, due to lighting or hair, where we need to retake that individual's photo, uh, to have them enroll directly into the system.

Strong: Then ....

Bolling: The students just walk up to a Microsoft surface tablet with a webcam. They stand in front of it and it signs them into school. ...

Strong: And he says it's more efficient than other options.

Bolling: ...It's important to the school district because they get reimbursed by the state based upon attendance. So they have a biometric means to verify the attendance of the kids that showed up that day. And it prevented students from borrowing or taking a friend's ID card and checking them in.
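
The closed-database workflow Bolling describes, enrolling a known set of consenting people from existing photos and then matching a walk-up face only against that set, can be sketched in a few lines of Python. The sketch below is an illustration only, assuming the open-source face_recognition library rather than Eyemetric's actual software; the file names and the 0.6 tolerance are hypothetical placeholders.

# A minimal sketch, assuming the open-source face_recognition library; not Eyemetric's code.
import face_recognition

# Enrollment: start from an empty database and add known, consenting people
# from existing photos (for example, school portraits).
enrolled = {}  # name -> 128-dimensional face encoding
for name, photo_path in [("Student A", "student_a_portrait.jpg")]:  # hypothetical entries
    image = face_recognition.load_image_file(photo_path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # a portrait can fail due to lighting or hair and need a retake
        enrolled[name] = encodings[0]

# Recognition: one frame captured at the walk-up tablet or webcam.
frame = face_recognition.load_image_file("walkup_frame.jpg")  # hypothetical capture
names, known = list(enrolled.keys()), list(enrolled.values())
for encoding in face_recognition.face_encodings(frame):
    distances = face_recognition.face_distance(known, encoding)
    if len(distances) and distances.min() < 0.6:  # the library's usual default tolerance
        print("Signed in:", names[int(distances.argmin())])
    else:
        print("No match in the enrolled database")

The point the sketch makes is the design choice described above: the system matches only against people who enrolled themselves, rather than against photos scraped from the internet.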

Strong: One of his clients is the Christ Church Community Development Corporation. It provides shelter, engagement and outreach services for the homeless in Bergen County, New Jersey. They use his company's fingerprint scanners, and over the next couple of months, they plan to also add facial recognition as another way to sign in. Biometrics have helped them collect better data, which they've used to tackle some big problems.

Sunden: We are the first community in the country to have ended both veterans homelessness and chronic homelessness.

Strong: Mary Sunden is the group's executive director.

Sunden: And one of the main reasons that we were able to do that is because of the information we collect and the way in which we use it to help the folks we work with.

Strong: This work has received national recognition from the federal agency that oversees housing and urban development. And she says the additional data they gather has also helped them track trends over time.

Sunden: So what we've noticed in the last two years is a significant increase in people who are 60 and older who are in the homeless system. This is not something that we ever had in our area 10 years ago, for example. So if, all of a sudden, instead of 20% of the people who come to dinner every night being 60 years old or older, all of a sudden it's 30%, and that goes on for more than just a day or two, then we see something's going on and we need to go and look into it and try and figure out what's the cause behind that. But it's the data that shows you things changing. And then you have to put the personal work into figuring out the motivation behind it or the reasons behind it.

Strong: Right now... she's testing a camera that takes people's temperatures... and hopes it won't be a far leap for clients to accept the addition of facial recognition.

Sunden: So you walk in the building and it automatically checks your temperature. And there's a picture of you. It's not storing your picture, but there you are, and it tells your temperature, and it guesses your gender and your age, incorrectly; we enjoy fooling it. I have changed, um, in the last week from an elderly to a middle-aged man or a woman. So that's kind of fun... but I don't expect, in this pandemic, that when people are in our building again soon they will have difficulty with that, because they'll understand that the purpose of it is to keep them safe. And life is just much more invasive now and we're used to it. Um, so I think that has an advantage, and we try to take advantage of things that happen. So the fact that, you know, people are used to being asked invasive questions about their health, we're just going to take advantage of that. And the facial rec is just another thing, not that different from the temperature scanning system.

Strong: Next episode...

Donnie Scott: You could see a future where as you arrive to the sporting event, it directs you to your parking based on recognizing your car or on sharing who you are from your phone. From parking, you're going to be directed through the shortest line. You know that line's going to move quickly because it's biometrically enabled.

Strong: We look at how facial recognition and other tracking systems are changing the sports experience in the stands and on the court.

Mike D'Auria: What this is intended to do is track the movement of every player and the ball 25 times a second. So you can kind of think, over the course of one typical NBA basketball game, you're able to kind of capture millions of data points that didn't exist before, and use those to build a suite of products or experiences on top of that, which can really change the way that we see and interact with sports.

Strong: This episode was reported and produced by me, Emma Cillekens, Tate Ryan-Mosley, and Karen Hao. We're edited by Michael Reilly and Gideon Lichfield. Thanks for listening, I'm Jennifer Strong.

[TR ID]
