FBI Looking To Use Amazon’s Facial Recognition Tech To ‘Recognize’ Stuff That Isn’t People’s Faces
A half-decade ago, Amazon was an emerging player on the facial recognition scene. Its proprietary blend was called "Rekognition." At the outset, Amazon was definitely interested in getting it in the hands of as many cops as possible. Documents obtained by the ACLU showed the company was courting law enforcement agencies, seeking to sell them a high-powered facial recognition variant capable of doing things its competition couldn't.
"Rekognition can identify, track, and analyze people in real time and recognize up to 100 people in a single image. It can quickly scan information it collects against databases featuring tens of millions of faces," according to Amazon.
Like other providers of technology to law enforcement, Amazon kept this under wraps by tying up public agencies with restrictive non-disclosure agreements, agreements law enforcement agencies cited while rejecting public records requests.
Rekognition, as powerful as it was, still suffered from the major flaws inherent in its competitors. It performed much worse when applied to minorities and women, resulting in 28 members of Congress (most of them people of color) being misidentified as wanted criminals during a test run of the product by the ACLU.
This very public failure was soon followed by a string of very public failures (i.e., the killing of Americans, most of them minorities) by law enforcement agents, culminating in the murder of George Floyd by Minneapolis police officer Derek Chauvin, an event that prompted demonstrations across the United States.
Amazon reconsidered its cop-forward position and decided it wasn't going to be part of the problem. In June 2020, it announced it would no longer be giving law enforcement agencies access to its facial recognition tech.
Given that it issued a "hands off" notice to law enforcement, it's something of a surprise that it's now providing its Rekognition tech to the FBI, as Jessica Hardcastle reports for The Register.
The FBI plans to use Amazon's controversial Rekognition cloud service "to extract information and insights from lawfully acquired images and videos," according to US Justice Department documents.
In its Agency Inventory of AI Use Cases, the DOJ lists the project, code-named Tyr, as being in the "initiation" phase for the FBI, which intends to customize and use the technology to "review and identify items containing nudity, weapons, explosives, and other identifying information."
That information comes from the DOJ's roundup of its in-progress AI projects [PDF]. And, at first glance, this would appear to violate Amazon's promise to keep this tech out of cops' hands.
But there are several caveats. As Amazon pointed out to The Register in response to a request for comment, it didn't actually say cops couldn't use the tech ever. They just couldn't use it to do the thing they were most likely to use it for had Amazon not restricted that aspect of it.
"Amazon has implemented a moratorium on use of Amazon Rekognition's face comparison feature by police departments in connection with criminal investigations. This moratorium does not apply to use of Amazon Rekognition's face comparison feature to help identify or locate missing persons."
So, there's the loophole that law enforcement can use. And Amazon's overall restriction does not apply to government agencies that aren't in the business of law enforcement. However, the FBI is clearly a law enforcement agency and definitely would be interested in deploying another facial recognition option.
But, according to the DOJ document, Amazon's tech is going to be used to "recognize" things that aren't human faces.
Amazon Rekognition offers pretrained and customizable computer vision (CV) capabilities to extract information and insights from lawfully acquired images and videos. Currently in initiation phase to customize to review and identify items containing nudity, weapons, explosives, and other identifying information.
Content moderation, but it's the FBI. The document doesn't explain the end goal of this use of the tech. And another project listed in the document suggests Amazon's tech is only part of the process. The other project also involves searching content for certain things.
Computer vision algorithms trained using AI techniques are used to classify and identify content in lawfully acquired images and videos to enable a user to quickly "find content" of interest in multimedia data. All results are reviewed by a human and no action is taken automatically based on the sole result of the algorithms.
Add these together and it sure looks like the FBI is trolling the open web looking for evidence of criminal activity. That's a bit worrying if that's what's actually happening. It could be this tech would only be applied to content retrieved from seized devices or whatever, but the potential to convert the internet into an FBI fishing hole remains.
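For what it's worth, the "flag it, but a human decides" workflow the DOJ describes is a standard pattern with this kind of tool. Here's a rough sketch of what that triage step might look like. The response shape loosely mirrors the output of Rekognition's moderation API, but the sample data, the 80% threshold, and the function itself are illustrative assumptions, not anything pulled from the DOJ document.

```python
# Sketch of the "classify, then human review" triage described above.
# The dict below imitates the general shape of a Rekognition moderation
# response; the labels, confidences, and threshold are made up for
# illustration.

SAMPLE_RESPONSE = {
    "ModerationLabels": [
        {"Name": "Weapons", "ParentName": "Violence", "Confidence": 92.4},
        {"Name": "Smoking", "ParentName": "Tobacco", "Confidence": 55.1},
    ]
}

def queue_for_review(response, min_confidence=80.0):
    """Return label names confident enough to hand to a human reviewer.

    Nothing is acted on automatically: low-confidence hits are simply
    dropped, and high-confidence hits still need a person to confirm them.
    """
    return [
        label["Name"]
        for label in response.get("ModerationLabels", [])
        if label["Confidence"] >= min_confidence
    ]

print(queue_for_review(SAMPLE_RESPONSE))  # only "Weapons" clears the bar
```

The point of the threshold is exactly what the DOJ document claims for itself: the algorithm narrows the haystack, and a human makes the call. Whether that's how it works in practice is another question.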
We'll see where this leads. Or, you know, maybe we won't. It all depends on how well the FBI can keep its secrets. And it's not the only concerning thing utilizing unproven tech on the list. The DOJ is also hooking up the ATF to existing ShotSpotter systems run by local law enforcement agencies, adding yet another way for false positives to go horribly wrong. And another ongoing project utilizes AI to scan documents to identify privileged communications between suspects and their lawyers, hopefully to prevent DOJ prosecutors from accidentally accessing these communications.
Amazon is back in the law enforcement business, even if its facial recognition tech isn't being used to search for faces. Then again, it never really left. It just allowed us to engage in our own assumptions about what its moratorium meant. And if we were wrong, well... that's on us.