
Axon Wants Its Body Cameras To Start Writing Officers’ Reports For Them

by Tim Cushing, from Techdirt on (#6MH7J)

Taser long ago locked down the market for "less than lethal" (but still frequently lethal) weapons. It has also written itself into the annals of pseudoscience with its invocation of the not-an-actual-medical-condition "excited delirium" as it tried to explain away the many deaths caused by its "less than lethal" Taser.

These days Taser does business as Axon. In addition to separating itself from its troubled (and somewhat mythical) past, Axon's focus has shifted to body cameras and data storage. The cameras are the printer and the data storage is the ink. The real money is in data management, and that appears to be where Axon is headed next. And, of course, like pretty much everyone at this point, the company believes AI can take a lot of the work out of police work. Here's Thomas Brewster and Richard Nieva with the details for Forbes.

On Tuesday, Axon, the $22 billion police contractor best known for manufacturing the Taser electric weapon, launched a new tool called Draft One that it says can transcribe audio from body cameras and automatically turn it into a police report. Cops can then review the document to ensure accuracy, Axon CEO Rick Smith told Forbes. Axon claims one early tester of the tool, Fort Collins Colorado Police Department, has seen an 82% decrease in time spent writing reports. "If an officer spends half their day reporting, and we can cut that in half, we have an opportunity to potentially free up 25% of an officer's time to be back out policing," Smith said.

If you don't spend too much time thinking about it, it sounds like a good idea. Doing paperwork consumes a large amount of officers' time, and a tool that automates at least part of the process would, theoretically, allow officers to spend more time doing stuff that actually matters, like trying to make a dent in violent crime - the sort of thing cops on TV are always doing but which is a comparative rarity in real life.

It's well-documented that officers spend a large part of their day performing the less-than-glamorous function of being an all-purpose response to a variety of issues entirely unrelated to the type of crimes that make headlines and fodder for tough-on-crime politicians.

On the other hand, when officers are given discretion to handle crime-fighting in whatever way they see fit, they almost always do the same thing: perform a bunch of pretextual stops in hopes of lucking into something more criminal than the minor violation that triggered the stop. A 2022 study of law enforcement time use by California agencies provided these depressing results:

Overall, sheriff patrol officers spend significantly more time on officer-initiated stops - "proactive policing" in law enforcement parlance - than they do responding to community members' calls for help, according to the report. Research has shown that the practice is a fundamentally ineffective public safety strategy, the report pointed out.

In 2019, 88% of the time L.A. County sheriff's officers spent on stops was for officer-initiated stops rather than in response to calls. The overwhelming majority of that time - 79% - was spent on traffic violations. By contrast, just 11% of those hours was spent on stops based on reasonable suspicion of a crime.

In Riverside, about 83% of deputies' time spent on officer-initiated stops went toward traffic violations, and just 7% on stops based on reasonable suspicion.

So, the first uncomfortable question automated report writing poses is this: what are cops actually going to do with all this free time? If it's just more of this, we really don't need it. All AI will do is allow problematic agencies and officers to engage in more of the biased policing they already engage in. Getting more of this isn't going to make American policing better and it's certainly not going to address the plethora of long-standing issues American law enforcement agencies have spent decades trying to ignore.

Then there's the AI itself. Everything in use at this point is still very much in the experimental stage. Auto-generated reports might turn into completely unusable evidence, thanks to the wholly expected failings of the underlying software.

These reports, though, are often used as evidence in criminal trials, and critics are concerned that relying on AI could put people at risk by depending on language models that are known to "hallucinate," or make things up, as well as display racial bias, either blatantly or unconsciously.

That's a huge problem. Also problematic is the expected workflow, which will basically allow cops to grade their own papers by letting the AI handle the basics before they step in and clean up anything that doesn't agree with the narrative an officer is trying to push. This kind of follow-up won't be optional, which also might mean some agencies will have to allow officers to review their own body cam footage - something they may have previously forbidden for exactly this reason.

On top of that, there's the garbage-in, garbage-out problem. AI trained on narratives provided by officers may take it upon itself to "correct" narratives that seem to indicate an officer may have done something wrong. It's also going to lend itself to biased policing by tech-washing BS stops by racist cops, portraying these as essential contributions to public safety.

Of course, plenty of officers do these sorts of things already, so there's a possibility it won't make anything worse. But if the process Axon is pitching makes things faster, there's no reason to believe what's already wrong with American policing won't get worse in the future. And, as the tech improves (so to speak), the exacerbation of existing problems and the problems introduced by the addition of AI will steadily accelerate.

That's not to say there's no utility in processes that reduce the amount of time spent on paperwork. But it seems splitting off a clerical division might be a better solution - a part of the police force that handles the paperwork and vets camera footage, staffed by people who are not the same ones who captured the recordings and participated in the traffic stop, investigation, or dispatch call response.

And I will say this for Axon: at least its CEO recognizes the problems this could introduce and suggests agencies limit automated report creation to things like misdemeanors, never using it in cases where deadly force is deployed. But, like any product, it will be the end users who decide how it's used. And so far, the expected end users are more than willing to streamline things they view as inessential, but far less interested in curtailing abuse by those using these systems. Waiting to see how things play out just isn't an acceptable option - not when there are actual lives and liberties on the line.
