Flock Safety’s Feature Updates Cannot Make Automated License Plate Readers Safe

by Mike Masnick, from Techdirt (#6YN52)

Two recent statements from the surveillance company, one addressing Illinois privacy violations and another defending the company's national surveillance network, reveal a troubling pattern: when confronted by evidence of widespread abuse, Flock Safety has blamed users, downplayed harms, and doubled down on the very systems that enabled the violations in the first place.

Flock's aggressive public relations campaign to salvage its reputation comes as no surprise. Last month, we described how investigative reporting from 404 Media revealed that a sheriff's office in Texas searched data from more than 83,000 automated license plate reader (ALPR) cameras to track down a woman suspected of self-managing an abortion. (A scenario that may have been avoided, it's worth noting, had Flock taken action when they were first warned about this threat three years ago.)

Flock calls the reporting on the Texas sheriff's office "purposefully misleading," claiming the woman was searched for as a missing person at her family's request rather than for her abortion. But that ignores the core issue: this officer used a nationwide surveillance dragnet (again: over 83,000 cameras) to track someone down, and used her suspected healthcare decisions as a reason to do so. Framing this as concern for her safety plays directly into anti-abortion narratives that depict abortion as dangerous and traumatic in order to justify increased policing, criminalization, control, and, ultimately, surveillance.

As if that weren't enough, the company has also come under fire for how its ALPR network data is being actively used to assist in mass deportation. Despite U.S. Immigration and Customs Enforcement (ICE) having no formal agreement with Flock Safety, public records revealed more than 4,000 nation- and statewide lookups by local and state police "done either at the behest of the federal government or as an 'informal' favor to federal law enforcement, or with a potential immigration focus." The network audit data analyzed by 404 exposed an informal data-sharing environment that creates an end-run around oversight and accountability measures: federal agencies can access the surveillance network through local partnerships without the transparency and legal constraints that would apply to direct federal contracts.

Flock Safety is adamant this is "not Flock's decision," and, by implication, not their fault. Instead, the responsibility lies with each individual local law enforcement agency. In the same breath, they insist that data sharing is essential, loudly claiming credit when the technology is involved in cross-jurisdictional investigations, but failing to show the same attitude when that data-sharing ecosystem is used to terrorize abortion seekers or immigrants.

Flock Safety: The Surveillance Social Network

In growing from a 2017 startup to a $7.5 billion company "serving over 5,000 communities," Flock gave individual agencies wide latitude to set and regulate their own policies. In effect, this approach offered cheap surveillance technology with minimal restrictions, leaving major decisions and actions in the hands of law enforcement while the company scaled rapidly.

And they have no intention of slowing down. Just this week, Flock launched its Business Network, facilitating unregulated data sharing amongst its private sector security clients. "For years, our law enforcement customers have used the power of a shared network to identify threats, connect cases, and reduce crime. Now, we're extending that same network effect to the private sector," Flock Safety's CEO announced.

The company is building out a new mass surveillance network using the exact template that ended with the company having to retrain thousands of officers in Illinois on how not to break state law, the very template that made it easy for officers to break that law in the first place. Flock's continued integration of disparate surveillance networks across the public and private spheres, despite the harms that have already occurred, is owed in part to the one thing it has gotten really good at over the past couple of years: facilitating a surveillance social network.

Employing marketing phrases like "collaboration" and "force multiplier," Flock encourages as much sharing as possible, going as far as to claim that network effects can significantly improve case closure rates. The company cultivates a sense of shared community and purpose among users so that they opt into good-faith sharing relationships with other law enforcement agencies across the country. But it's precisely that social layer that creates uncontrollable risk.

The possibility of human workarounds at every level undermines any technical safeguards Flock may claim. Search term blocking relies on officers accurately labeling search intent, a system easily defeated by entering vague reasons like "investigation" or incorrect justifications, whether intentional or not. And, of course, words like "investigation" or "missing person" can mean virtually anything, offering no value to meaningful oversight of how and for what the system is being used. Moving forward, sheriff's offices looking to avoid negative press can surveil abortion seekers or immigrants with ease, so long as they use vague, innocuous-sounding reasons.

The same can be said for case number requirements, which depend on manual entry and can easily be circumvented by reusing legitimate case numbers for unauthorized searches. Audit logs only track inputs, not contextual legitimacy. And Flock's proposed AI-driven audit alerts, which might flag suspicious activity only after searches (and harm) have already occurred, rely on local agencies to self-monitor misuse, despite their demonstrated inability to do so.

And, of course, even the most restrictive department policy may not be enough. Austin, Texas, had implemented one of the most restrictive ALPR programs in the country, and the program still failed: the city's own audit revealed systematic compliance failures that rendered its guardrails meaningless. The company's continued appeal to "local policies" means nothing when Flock's data-sharing network does not account for how law enforcement policies, regulations, and accountability vary by jurisdiction. You may have a good relationship with your local police, who solicit your input on what their policy looks like; you don't have that same relationship with the hundreds or thousands of other agencies with whom they share their data. So if an officer on the other side of the country violates your privacy, it'd be difficult to hold them accountable.

ALPR surveillance systems are inherently vulnerable to both technical exploitation and human manipulation. These vulnerabilities are not theoretical; they represent real pathways for bad actors to access vast databases containing millions of Americans' location data. When surveillance databases are breached, the consequences extend far beyond typical data theft: this information can be used to harass, stalk, or even extort. The intimate details of people's daily routines, their associations, and their political activities may become available to anyone with malicious intent. Flock operates as a single point of failure that can compromise, and has compromised, the privacy of millions of Americans simultaneously.

Don't Stop de-Flocking

Rather than addressing legitimate concerns about privacy, security, and constitutional rights, Flock has only promised updates that fall short of meaningful reforms. These software tweaks and feature rollouts cannot assuage the fear engendered by the massive surveillance system it has built and continues to expand.

Flock's insistence that what's happening with abortion criminalization and immigration enforcement has nothing to do with them (that these are just red-state problems or the fault of rogue officers) is concerning. Flock designed the network that is being used, and the public should hold them accountable for failing to build in abuse protections that cannot be easily circumvented.

Thankfully, that's exactly what's happening: cities like Austin, San Marcos, Denver, Norfolk, and San Diego are pushing back. And it's not nearly as hard a choice as Flock would have you believe: Austinites are weighing the benefits of a surveillance system that generates a hit less than 0.02% of the time against the possibility that scanning 75 million license plates will result in an abortion seeker being tracked down by police, or an immigrant being flagged by ICE in a so-called "sanctuary city." These are not hypothetical risks. They are already happening.

Given how pervasive, sprawling, and ungovernable ALPR sharing networks have become, the only feature update we can truly rely on to protect people's rights and safety is no network at all. And we applaud the communities taking decisive action to dismantle this surveillance infrastructure.

Follow their lead: don't stop de-flocking.

Originally published to the EFF Deeplinks blog.
