
Detroit Alters Facial Recognition Use Rules In Response To Multiple Bogus Arrests

by
Tim Cushing
from Techdirt

All facial recognition tech is flawed. Some offerings may be less flawed than others, but the underlying problems (mainly, reduced accuracy when identifying minorities and women) remain.

In Detroit, those problems surfaced quickly and dramatically. Hundreds of US law enforcement agencies use facial recognition tech, but the Detroit PD was especially bad at it.

The main problem here wasn't necessarily the software, which was provided by DataWorks Plus. It was the officers using it. Rather than follow internal rules and DataWorks' own guidance, which made clear that potential matches weren't to be treated as probable cause for an arrest, officers simply went after whoever the algorithm suggested might be a match.

By the middle of 2020, the Detroit PD and its tech had already been linked to two wrongful arrests. A third followed a couple years later. These are probably not the only false arrests linked to the tech, but these are the three that have generated lawsuits.

One of those lawsuits has resulted in a settlement that alters how the Detroit PD uses the tech, in hopes of curbing a problem that has been particularly acute in Detroit, as Kashmir Hill reports for the New York Times.

In January 2020, Robert Williams spent 30 hours in a Detroit jail because facial recognition technology suggested he was a criminal. The match was wrong, and Mr. Williams sued.

On Friday, as part of a legal settlement over his wrongful arrest, Mr. Williams got a commitment from the Detroit Police Department to do better. The city adopted new rules for police use of facial recognition technology that the American Civil Liberties Union, which represented Mr. Williams, says should be the new national standard.

"We hope that it moves the needle in the right direction," Mr. Williams said.

Multiple limitations of the tech were exposed in a very short period by Detroit PD officers' reckless use of search results. The first false arrest involved someone who clearly couldn't have been the suspect if officers had bothered to look further than his face. The recording of the suspected arsonist used by investigators showed a suspect with bare, untattooed arms. The arrestee's arms were covered with tattoos.

Another arrest involved a pregnant woman who was selected as a match by the PD's software, despite clearly being incapable (at eight months into her pregnancy) of carrying out a carjacking. (Or, even if she could have, her pregnancy would have been the most notable thing pointed out by the victim of the crime.)

The third case is the one that has netted this crucial settlement. Robert Williams sued the PD and the arresting officers after he was arrested for a jewelry store robbery he didn't commit. The investigators (I'm using that term loosely) had been involved in other bogus arrests stemming from over-reliance on facial recognition search results. In this case, they used a low-res screengrab from the store's CCTV to run a search, even though the photo was incapable of generating anything but garbage results.


Because doing nothing would allow this sort of thing to continue, the city of Detroit has agreed to a settlement [PDF] that changes the way its cops are allowed to use facial recognition search results. The most important change is that officers will no longer be able to show images of people surfaced via facial recognition tech to eyewitnesses unless investigators have some other evidence linking that person to the crime.

There are more stipulations and limitations as well.

The department is also changing how it conducts photo lineups. It is adopting what is called a double-blind sequential, which is considered a fairer way to identify someone. Rather than presenting a "six-pack" to a witness, an officer - one who doesn't know who the primary suspect is - presents the photos one at a time. And the lineup includes a different photo of the person from the one the facial recognition system surfaced.

The police will also need to disclose that a face search happened, as well as the quality of the image of the face being searched - How grainy was the surveillance camera? How visible is the suspect's face? - because a poor quality image is less likely to produce reliable results. They will also have to reveal the age of the photo surfaced by the automated system, and whether there were other photos of the person in the database that did not show up as a match.

That should help limit this sort of damage going forward. But maybe the city should take a look at the tech itself, which seems to be more flawed than most. Even the police chief seemed unimpressed back in 2020 when these stories first started surfacing, telling city officials the software "misidentified people 96 percent of the time."

Ensuring it's not the only thing being used to obtain arrest warrants will help eliminate some of the margin of error. Beyond that, it might be time to find a different vendor with a more accurate product. Or, better yet, to stop using it altogether because, for all the tech involved, it often seems no more reliable than your average eyewitness.
