
Facial Recognition Rings Up Another False Arrest, Leading To The Accused Being Brutalized In Jail

by Tim Cushing from Techdirt on (#6JD2X)

Facial recognition may be helping law enforcement catch bad guys, but inherent flaws in these systems ensure it's only a matter of time before the AI coughs up yet another unforced error.

That sort of error rate might be acceptable when the AI is doing nothing more than pitching in to refine Google search results or, I don't know, crafting articles for national sports publications. But it's far more harmful when it's helping the government deprive people of their freedom.

These may seem like anomalies when compared to the massive number of facial recognition searches law enforcement performs every day. But each mistake causes falsely accused people to pay a heavy price - something that goes far beyond a short detainment.

Someone merely accused of a crime can expect their access to pretty much everything (financial assistance, housing, employment) to be negatively impacted. Any period of incarceration is far more than an inconvenience. Even short stays can result in lost jobs, evictions, or reputational damage that lasts far longer than whatever time is spent locked up.

This case, covered by Matthew Gault for Vice, involves a private company's use of facial recognition tech. But that tech is used to report people to law enforcement, which means the government quickly brings its power to bear in cases like these, to the detriment of life and liberty.

The details of this case are horrific. What started out as a mistaken armed robbery accusation soon turned into a nightmare for Houston, Texas resident Harvey Murphy.

According to a lawsuit 61-year-old Murphy has filed against Macy's and Sunglass Hut, he was "arrested and put into an overcrowded maximum-security jail with violent criminals. While in jail trying to prove his innocence, he was beaten, gang-raped, and left with permanent and awful life-long injuries. Hours after being beaten and gang-raped, the charges against him were dropped and he was released."

"All of this because a company told the police, based on artificial intelligence, that you were the one who committed terrible crimes," the lawsuit said.

It was a company that pulled the trigger on this one. An armed robbery on January 22, 2022, saw two men threaten Sunglass Hut employees with guns before walking off with a handful of cash and designer glasses. A similar robbery at a Macy's resulted in the two companies working together to identify the suspects.

A mug shot of Murphy, taken nearly 40 years ago, was the supposed key to this investigation. The Sunglass Hut loss prevention officer, Anthony Pfleger, reached out to law enforcement claiming he knew who had committed the crimes. This loss prevention officer also allegedly coached a Sunglass Hut employee to identify Murphy in the law enforcement mug shot lineup.

But there was a major flaw in this investigation - one initiated by a private company and closed out by Houston law enforcement.

At the time of the robbery, Murphy was in Sacramento, California. He didn't find out about the robbery, or that he'd been blamed for it, until he went to the DMV to renew his driver's license. He was arrested and held without bond. Despite sending his court-appointed lawyer the evidence that exonerated him, he still spent hours in jail.

Law enforcement loves to portray detentions that "only last hours" as so minimally detrimental to people's rights that they're not worthy of additional attention, much less the focus of a civil rights lawsuit. But it only takes seconds to violate a right. And once it's violated, it stays violated, even if charges are eventually dropped.

For Murphy, it took only hours to be physically and sexually assaulted by other inmates and detainees. And it took only one false match to put him through this hell, despite him being more than 2,000 miles away when the robberies occurred.

And that's the problem with this tech. It will always be able to ruin someone's life because it can't be trusted to deliver accurate matches, especially when it comes to women and minorities. Private companies and law enforcement agencies can point to low false positive percentages all they want, but at the end of the day, people pay a significant, extremely real price for things dismissed as acceptable error rates.
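To put that in perspective, here's a quick back-of-the-envelope calculation showing how a "low" false positive percentage still translates into real people being wrongly flagged at scale. Both numbers below are hypothetical placeholders, not figures from the lawsuit or from any vendor:

# A minimal sketch of the base-rate problem: a "low" false positive
# rate still yields a steady stream of wrongly flagged people once
# search volume gets large. Both numbers here are hypothetical.

false_positive_rate = 0.001   # 0.1% -- the kind of rate a vendor might tout
searches_per_day = 100_000    # assumed daily searches across all agencies

false_matches_per_day = false_positive_rate * searches_per_day
print(f"Expected false matches per day:  {false_matches_per_day:,.0f}")
print(f"Expected false matches per year: {false_matches_per_day * 365:,.0f}")
# -> 100 per day, 36,500 per year -- each one a potential Harvey Murphy

Under those assumptions, the "acceptable error rate" framing hides tens of thousands of false matches a year, and it only takes one of them reaching a jail cell to produce a case like this one.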
