by Tim Cushing
On January 9, 2020, facial recognition tech finally got around to doing exactly the thing critics had been warning was inevitable: it got the wrong person arrested.

Robert Williams was arrested by Detroit police officers in the driveway of his home. He was accused of shoplifting watches from a store on October 2, 2018. The store (Shinola) had given Detroit investigators a copy of its surveillance tape, which apparently was of little interest to the Detroit PD until it had some facial recognition software to run it through.

This was the dark, grainy image the Detroit PD felt was capable of returning a quality match:

That picture is included in Williams' lawsuit [PDF] against the Detroit Police Department. Even in the best-case scenario, this picture should never have been used to run a search. It's low quality, poorly lit, and barely shows any distinguishing facial features.

What makes it worse is that all facial recognition AI -- across the board -- performs more poorly when attempting to identify minorities. That's the conclusion reached by an NIST study of 189 different algorithms. It's not just some software. It's all of it.

The Detroit PD chose to run with that photo. Then it decided the search results it received were close enough to probable cause to effect an arrest, even though the software used clearly stated that search results should not be used this way. The search was performed by the Michigan State Police using the grainy image submitted by the Detroit PD. A report was returned, but investigators were cautioned against trying to turn it into probable cause: