Oh Look, Some Cop Just Got Busted For Abusing Access To Clearview AI
The inevitable is upon us: a police officer has been caught using Clearview AI for non-law enforcement purposes. That wouldn't mean much if the officer had purchased his own private access to the most ethically dubious player in the facial recognition tech market. But he didn't. He was using access purchased by his employer, so this wasn't only a violation of department policy, but a clear, non-law enforcement-related violation of the privacy of the people on the other end of those searches.
An officer with the Evansville Police Department has resigned following an investigation into his misuse of the department's A.I. technology.
Evansville Police Chief Philip Smith said Tuesday that Officer Michael Dockery, a five-year member of the police department, resigned before the Police Merit Commission could make a final determination for termination.
The fact that Officer Dockery decided to resign rather than be disciplined (further) or fired is equally unsurprising. If you can get out before the hammer falls, you can just go ply your trade at another law enforcement agency, since you won't have anything on your permanent record. (And that's if the new employer even cares to look at your permanent record. Most law enforcement agencies either don't bother to check incoming officers' pasts, or just don't consider causes for concern to be cause for concern.)
The more surprising aspect of this incident was how it was discovered. The chief was performing an audit of the software prior to the PD's renewal of its contract. And that's when he came across the improper searches. He suspended Officer Dockery for 21 days, at which point Dockery decided to call it quits.
This is more of the same bullshit we've come to expect from cops who have access to other people's personal information. Officers have been caught running personal searches on driver's license databases and other repositories of personal data collected by government agencies.
Dockery didn't play by the rules established by his employer. But he was pretty much completely aligned with Clearview's ethically dubious marketing tactics, in which it encouraged potential customers (including law enforcement agencies) to run personal searches using its AI and its database of scraped web images, which once numbered in the millions and now exceeds 30 billion.
[I]n a November email to a police lieutenant in Green Bay, Wisconsin, a company representative encouraged a police officer to use the software on himself and his acquaintances.
"Have you tried taking a selfie with Clearview yet?" the email read. "It's the best way to quickly see the power of Clearview in real time. Try your friends or family. Or a celebrity like Joe Montana or George Clooney."

"Your Clearview account has unlimited searches. So feel free to run wild with your searches," the email continued.
Maybe this seems like a one-off. I guarantee it isn't. This is just someone who got caught. Plenty of agencies have access to facial recognition tech. The number of agencies that engage in periodic audits is undoubtedly far smaller than the number of agencies using the tech.
The only thing anomalous about this is that the agency moved quickly to discipline the officer who violated department policy. Once again, I can guarantee plenty of other violations have occurred, and at least some of those have been discovered. But a discovery followed by immediate (or any!) discipline is an actual unicorn.
There will be more incidents like this in the future. And as for (at the moment) former officer Michael Dockery, he'd better hope his next employer is a regression to the mean in terms of police accountability if he wants to keep his job.