Citizens are turning face recognition on unidentified police

by Tate Ryan-Mosley
from MIT Technology Review

The new series of our AI podcast, In Machines We Trust, is all about face recognition. In part one of the series, Jennifer Strong and the team at MIT Technology Review explore the unexpected ways the technology is being used, including how it is being turned on police.

We meet:
  • Christopher Howell, data scientist and protester.

Credits:

This episode was reported and produced by Jennifer Strong, Tate Ryan-Mosley, Emma Cillekens, and Karen Hao. We're edited by Michael Reilly and Gideon Lichfield.

Transcript:

[Advertisement]

[TR ID]

Strong: A few things have happened since we last spoke about facial recognition. We've seen more places move to restrict its use, while at the same time schools and other public buildings have started using face ID as part of their covid-prevention plans. We're even using it on animals, and not just on faces with similarities to our own, like chimps and gorillas. Chinese tech firms use it on pigs, and Canadian scientists are working to identify whales, even grizzly bears.

In other words, the number of ways we might use this technology is exploding, as are concerns about whether that's feasible, let alone a good idea. And so bans on how face ID can be used are expanding, from Portland, Maine, to Portland, Oregon, where the toughest restrictions on facial recognition in the country are set to take effect next year, banning not just its use by police but commercial applications as well.

Wheeler: Colleagues, we are here this afternoon to consider two ordinances that seek to ban the use of facial recognition technologies by our own Portland city government and by private entities in the public spaces.

Strong: That's Portland Mayor Ted Wheeler opening the city council meeting that passed these bills in September. As with most things these days, the public comments and vote took place on Zoom.

Wheeler: Portland is far from an anti-technology city, and I want to make that very clear. We are not anti-technology with a wide array of local and national tech companies. We are one of the fastest growing tech hubs anywhere on the West coast.

Strong: Over the next few hours, lawyers, software engineers, concerned citizens, and business leaders all had their say. Then, just before the vote took place, one last person, a local data scientist, raised his virtual hand to ask a question.

Wheeler: Christopher Howell... last but not least, welcome.

Howell: I would like to express a conditional support for this ordinance, but I have concerns.

Strong: That's because he's building something that uses facial recognition in a less than conventional way.

Howell: I'm involved with developing facial recognition to in fact use on Portland police officers since they are not identifying themselves to the public and they are committing crimes. Would this become illegal if you pass this ordinance?

[Music]

Strong: I'm Jennifer Strong, and over the next several weeks we're going to do another deep dive on the rise of facial recognition. In this latest miniseries, we'll explore how it's being used in sports, both on athletes and fans, its use in retail stores, even how it's used to serve the homeless. But first, we kick things off with a look at how it's being turned back on police.

[SHOW ID]

Strong: The anonymity of police and other authority figures is steeped in a really complicated history. There's a delicate balance of privacy and accountability on both sides for protesters and for police.

[Sound of Chicago riots.]

Strong: The Chicago riots of 1968 played out on TVs all over the country as Americans tuned in to watch the Democratic National Convention, which was held in Chicago that year.

[Sound of Chicago riots.]

Strong: Protesters were demonstrating against the Vietnam war and against the Democratic Party. After clashing with the police at Grant Park, protesters marched down Michigan Avenue to the hotel where many of the convention delegates were staying. For 17 minutes, America watched live as the Illinois National Guard fired tear gas... as the police beat the demonstrators.

[Sound of Chicago riots. "The whole world's watching."]

Strong: The police weren't wearing name tags, and this is actually pretty common when police deal with protesters. Sometimes it's legal, sometimes it's illegal, and sometimes it's even mandated in the interest of safety, like this summer in Buffalo, New York.

Reporter: The Buffalo Police commissioner says it was his decision to have officers wear their badge numbers and not their names on their uniforms.

Commissioner: There was a rising harassment concern with officers and their family. In order to allow officers to do their job without fear, I made the decision at that time.

Strong: But when it happens, citizens and activists push back. Many argue this gives police even more power than they already have...because it becomes nearly impossible to hold them accountable.

Interviewee: We need our council to stand up and speak for the 255,000 people you represent and encourage and demand that this police force change the policy back instead of thinking solely of the dozen or so officers who were harassed, who could be protected under the current existing law.

Strong: But well before the events of this summer, people were already trying to identify unidentified police officers. They've used photos from protests, sleuthing on social media, crowdsourcing through the internet, even resorted to stealing personal information, as in the case of a hacker collective called Anonymous. It's known for aligning itself with the Occupy Wall Street protests and briefly taking down the New York Stock Exchange's website. That same group later leaked private information it stole from police and government websites, including the home addresses of police officers.

Prank Call: You have reached the Baldwin County Sheriff's Office... [ring]

Strong: In this prank call, they claim credit for the hack.

[Excerpts from the prank call]

Strong: Unmasking those wielding force over others isn't unique to the U.S. Back in 2014, during Russia's annexation of Crimea, you might recall that soldiers without badges on their green uniforms seized control. At the time, top officials, including President Vladimir Putin, repeatedly denied that those troops were Russian.

Putin: There are no troops whatsoever. No Russian troops, at least.

Strong: This is Russia's ambassador to the EU speaking to reporters.

Ambassador: The United States being in the tradition of interfering in other countries and sending troops overseas, may be acting according to their own mentality... I would say. But this is not a case of Russian interference.

Strong: But it was. By matching photos, potentially with the help of facial recognition, the Ukrainian government determined the troops were in fact tied to the Russian military. And they released photos as proof. Later, Putin also admitted the troops were indeed Russian.

Putin Translation: Of course, the Russian servicemen did back the Crimean self-defence forces. They acted in a civil, as I've already said, but a decisive and professional manner.

[Music]

Strong: Now protesters are increasingly trying to turn face ID back on police, to identify officers who use excessive force, including in Hong Kong, where last year dramatic images of clashes between protesters and police dominated news feeds, showing police firing live bullets and protesters attacking officers. This is New York Times reporter Paul Mozur speaking on public television.

Mozur: As the protests have gone on and police and Hong Kongers continue to square off week after week, the face has become weaponized, and identity itself, in a way, is weaponized. You know, protesters will go out, and police will try to capture their images on video and then go back and identify them via all the social media and online materials that are out there, and then vice versa. We saw the police actually take their badges off, and so now protesters are doing the same to the police, where they are trying to go back and use social media and figure out which police are doing what act. One protester in particular, this guy Colin Cheung that we found, created a facial recognition tool to try to identify police... and he didn't actually release the product, but he says because of that police targeted him.

Strong: And Cheung was arrested shortly after he posted about the tool on Facebook... but similar efforts are ongoing in other places, including Portland, Oregon.

Howell: I'm taking the exact same technology they use on us. And it's not like I'm trying to dox people. I mean, this is for officers that are essentially breaking the law in terms of their use of force.

Strong: Christopher Howell is a data scientist, and protester.

Howell: And I just wanted to get on the record. Hey, there's another way we could use this. // And really it's more about, you know, keeping the pressure on them to identify themselves as opposed to us having to do it.

Strong: His testimony before the Portland city council about this tool he's working on led to his project getting covered by The New York Times. Since then, he's gotten a fair bit of attention, as did the reaction of Portland's mayor, who called the project creepy.

Howell: It's creepy to let them go out and tear gas people night after night // and I, that was really, to me was just a, kind of a mind blowing moment of like, it's creepy that these guys don't have their names. You know, that someone can just dress up like a policeman you know [laugh]

Strong: He says he started the project partly just to see how it would work.

Howell: I mean, there's a definite technological curiosity there... and that sort of, something I can do when it's late at night and it's frustrating to see this stuff, the news or you go on Twitter and see people's, everyone's videos of the protests and okay, well, what can I do? ...and ultimately I would like it not to be necessary. I think they should wear their names in tall letters so that we can not have to use facial recognition on cops to try to figure out who they are.

Strong: Being able to identify an officer isn't just about knowing if someone actually is one... It's extremely difficult to file a complaint against someone without knowing who that person is.

Howell: One of the reasons I started this was because in lawsuits, you can't name like the cop who shoved me. You have to know who it is. They'll just say, well, we don't know who it was. So there's nobody for you to sue and trying to get records on what particular officers did they will not let you go fishing for it. // You need to know so, if you gave me a picture of, you know, here's this cop hitting somebody with a baton I want to, and we're going to figure out, you can see his face. So we're going to figure out who it is, how do we do that? And initially I was more thinking on a reverse image search on a database. Like I'll just collect in all the pictures I can, and we'll be able to link them together so we can say these are all the same person. And then I kind of realized I could do better than that.

Strong: He realized he could build a face ID tool, using images of the officers from the internet.

Howell: So I started looking at their pictures where they are identified or news, you know, going back further and getting like news articles.

Strong: The project is technically easier than building a system meant to identify anyone.

Howell: We know that who the police are, you know, there aren't, there shouldn't be people in riot gear who aren't police officers. So if you can get all of their pictures and that's the real challenge, but then you could have, a less difficult mathematical problem in that sense to identify, okay, we know this person is out of this set. Let's find out which one of them they are...

Strong: He says he's collected thousands of pictures so far... mostly manually, from Twitter and news articles, on average 15 to 20 or so for each officer he's identified.

Howell: A lot of the initial images were from Twitter. A lot of it was me going into news articles. // I mean, the traffic accident, one is, you know, news story is such a great example. Cause I found a bunch of those. Or like community barbecue and there are uniformed officers there and it names them.

Strong: It's not exactly the same scale as the face ID systems used by police...

Howell: I'm just all doing this right now on a Python script - you know, it's all local on my laptop.

Strong: ...those are usually trained on photo sets of millions or billions.

Howell: Because I do think the, the accuracy is important and, uh, and the number of images is small enough that if I was trying to put my face in there, I don't want to get another Chris Howell.

Strong: But it's just enough images that it seems to do the trick. He says his tool has already been used to help confirm the identity of an officer in a case that's headed to court.

Howell: Somebody brought me a picture I hadn't found and said, help me, you know, you use your system and tell me what it says. And she already knew who she thought it was, but she didn't tell me. And then the top result was the one that was expected.
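
For a sense of what the closed-set matching Howell describes might look like in practice, here is a minimal sketch in Python. It assumes the open-source face_recognition library, a folder of gallery photos already labeled by officer name, and that library's conventional distance threshold of 0.6; the episode never names the tools Howell actually uses, so the library choice, folder layout, and threshold are illustrative assumptions, not his implementation.

import os
import face_recognition

# Hypothetical layout: gallery/<officer_name>/*.jpg, assembled from labeled news and social media photos.
GALLERY_DIR = "gallery"

def build_gallery(gallery_dir):
    # Compute one 128-dimensional face encoding per labeled gallery image.
    names, encodings = [], []
    for name in sorted(os.listdir(gallery_dir)):
        person_dir = os.path.join(gallery_dir, name)
        if not os.path.isdir(person_dir):
            continue
        for filename in os.listdir(person_dir):
            image = face_recognition.load_image_file(os.path.join(person_dir, filename))
            found = face_recognition.face_encodings(image)
            if found:  # skip photos where no face is detected
                names.append(name)
                encodings.append(found[0])
    return names, encodings

def identify(unknown_path, names, encodings, threshold=0.6):
    # Closed-set matching: return the nearest labeled identity, or None if no face
    # is found or the best match is farther away than the threshold.
    image = face_recognition.load_image_file(unknown_path)
    found = face_recognition.face_encodings(image)
    if not found:
        return None
    distances = face_recognition.face_distance(encodings, found[0])
    best = distances.argmin()
    return names[best] if distances[best] <= threshold else None

if __name__ == "__main__":
    names, encodings = build_gallery(GALLERY_DIR)
    print(identify("unknown_officer.jpg", names, encodings))

Matching against a small, fixed set of identities is what keeps a project like this tractable on a laptop; the hard part, as Howell says, is assembling the labeled gallery in the first place.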

Strong: As for where this goes next, that's anyone's guess, though he can imagine a day when it might be used in partnership with the city.

Howell: I could see a future, maybe years from now, when things are a bit different where the city makes something like this available and hosts it themselves and makes them take a bunch of pictures so we can have a well-trained thing. So as a like citizen feedback thing. And, and it wouldn't necessarily have to all be negative things, but in a way that it could also be used for complaints. I mean, part of me thinks there's there would be a use they're partnering with the government agencies themselves to say, "Hey, we want to be accountable." I don't think that's very realistic in 2020, but I think it, it could be not far away! [laugh]

Strong: Something he doesn't want is to make it open source.

Howell: I don't think it's a great idea for anybody really to have it just be like public facing on a website. Think that's, it's asking for things to go wrong beyond like the really obvious things you could have, people, you know, faking things or intentionally putting in misleading pictures...

[Music]

Strong: Portland's bans on face ID will take effect in January, but Howell's project? It won't be impacted. Public schools, religious institutions, even how facial recognition is used at the airport, such as the way airlines like Delta board their passengers. None of that will be impacted either. Local government agencies won't be able to use it. And it won't be allowed in most public spaces, or private spaces that are open to the public, like a shopping mall. Local police will not be allowed to use the technology either, though people in their private homes, like Christopher Howell, will be. The bans also don't apply to law enforcement at the state or federal level.

In a way, the use of this surveillance technology is becoming a sort of global arms race between the public and authority figures, both hoping to peel back the cover of anonymity to encourage good behavior. It's hard to see who the winner might be, but certainly the loser is privacy.

In the next episode...

Excerpt: It's not trauma informed to have somebody walk into a facility and say, yes, you can absolutely come in, but let me just take your fingerprints.

Strong: We look at the move to use facial recognition in public housing and homeless shelters.

Excerpt: Facial recognition is kinder emotionally in that it's passive, it doesn't touch them, and you can capture it more quickly, and there's no risk of transmission.

Strong: This episode was reported and produced by me, Tate Ryan-Mosley, Karen Hao and Emma Cillekens. We're edited by Michael Reilly and Gideon Lichfield.

Thanks for listening, I'm Jennifer Strong.

[TR ID]
