ShotSpotter Employees Not Only Have The Power To Alter Gunshot Reports, But Do It Nearly 10% Of The Time

What ShotSpotter is presenting as good news for people who feel they've been wrongly accused doesn't actually appear to be all that comforting.
ShotSpotter's mic tech and AI combine forces to report possible gunshots to law enforcement customers. It's very hit or miss, with all possible puns intended. ShotSpotter says it's nearly 100% accurate and can play an important part in reducing gun crime.
Actual customers say something else:
A 2013 investigation of the effectiveness of ShotSpotter in Newark, New Jersey revealed that from 2010 to 2013, the system's sensors alerted police 3,632 times, but only led to 17 actual arrests. According to the investigation, 75% of the gunshot alerts were false alarms.
97% accuracy? Not what we've seen, says the San Diego PD:
A San Diego Police Department spokesperson told Voice of San Diego that during the four years ShotSpotter had been in use (as of September 2020), officers had made only two arrests while responding to alerts, and only one of those was directly linked to the alert.
"Meanwhile, 72 of the 584 ShotSpotter alerts during that time period were determined to be unfounded, a whopping 25 times higher than the 0.5 percent false positive rate put forth by the company," the Voice of San Diego reported, based on data provided by the city's police department.
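If you want to check the math yourself, here's a quick back-of-the-envelope run (Python, using only the figures quoted in the two reports above; nothing here comes from ShotSpotter) that bears out both cities' numbers:
```python
# Newark, 2010-2013: 3,632 alerts led to just 17 arrests.
newark_alerts, newark_arrests = 3632, 17
print(f"Newark arrests per alert: {newark_arrests / newark_alerts:.2%}")  # ~0.47%

# San Diego, ~4 years: 72 of 584 alerts were determined to be unfounded,
# versus the 0.5% false positive rate put forth by the company.
sd_unfounded, sd_alerts = 72, 584
sd_rate = sd_unfounded / sd_alerts
print(f"San Diego unfounded rate: {sd_rate:.1%}")                # ~12.3%
print(f"Multiple of the claimed 0.5%: {sd_rate / 0.005:.0f}x")   # ~25x, as reported
```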
Accuracy aside, can it help reduce gun violence and criminal acts linked to fired weapons? Again, the answer is no.
The City of Chicago Office of Inspector General's (OIG) Public Safety section has issued a report on the Chicago Police Department's (CPD) use of ShotSpotter acoustic gunshot detection technology and CPD's response to ShotSpotter alert notifications. OIG concluded from its analysis that CPD responses to ShotSpotter alerts can seldom be shown to lead to investigatory stops which might have investigative value and rarely produce evidence of a gun-related crime.
That doesn't mean the Chicago PD doesn't think the tech is useful. In Chicago, officers still like ShotSpotter despite its inaccuracy because it allows them to do the sorts of things they want to do.
In reviewing ISR [investigative stop report] narratives for mentions of ShotSpotter alerts, OIG also identified 10 ISRs (13.9%) in which reporting officers referred to the aggregate results of the ShotSpotter system as informing their decision to initiate a stop or their course of action during the stop, even when they were not responding to a specific ShotSpotter alert. For example, some officers during the reporting period identified the fact of being in an area known to have frequent ShotSpotter alerts as an element of the reasonable suspicion required to justify the stop. Other officers reported conducting "protective pat downs" following a stop because they knew themselves to be in areas where ShotSpotter alerts were frequent.
A new Associated Press report - based on confidential ShotSpotter records shared with the news agency - is full of the sort of good news/bad news that tends to get saddled with noncommittal headlines, like this one: "Confidential document reveals key human role in gunshot tech."
Here's what's notable in this report:
[A] confidential ShotSpotter document obtained by The Associated Press outlines something the company doesn't always tout about its "precision policing system" - that human employees can quickly overrule and reverse the algorithm's determinations, and are given broad discretion to decide if a sound is a gunshot, fireworks, thunder or something else.
Such reversals happen 10% of the time by a 2021 company account, which experts say could bring subjectivity into increasingly consequential decisions and conflict with one of the reasons AI is used in law-enforcement tools in the first place - to lessen the role of all-too-fallible humans.
The AP is technically correct. ShotSpotter says its tech can do what police can't: be omnipresent with ears at the ready. What it pitches to cop shops is near perfection, a 97% success rate in hearing and locating gunshots.
What's not made immediately clear is the role played by the human backstops. That role is absolutely essential: loud noises should not be instantly assumed to be gunshots. Hence the need for trained human employees to sort the "possibles" from the "confirmed."
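ShotSpotter hasn't published how its review pipeline works internally, but the AP's description reduces to a familiar human-in-the-loop pattern: the algorithm makes a determination, and a reviewer with broad discretion can simply replace it. Here's a minimal sketch of that pattern; every name, label, and number in it is hypothetical, not ShotSpotter's actual code.
```python
from dataclasses import dataclass

# Hypothetical label set; per the AP, reviewers decide if a sound is a
# gunshot, fireworks, thunder "or something else."
LABELS = {"gunshot", "fireworks", "thunder", "other"}

@dataclass
class Alert:
    algorithm_label: str                # the classifier's determination
    algorithm_confidence: float
    reviewer_label: str | None = None   # set only when a human overrules

    @property
    def final_label(self) -> str:
        # When a reviewer has weighed in, their call simply replaces the
        # algorithm's. That unchecked substitution is the "broad
        # discretion" the AP describes.
        return self.reviewer_label or self.algorithm_label

def reversal_rate(alerts: list[Alert]) -> float:
    """Fraction of alerts where a human overruled the algorithm."""
    overruled = [a for a in alerts
                 if a.reviewer_label and a.reviewer_label != a.algorithm_label]
    return len(overruled) / len(alerts) if alerts else 0.0

# Example: the algorithm hears fireworks; a reviewer reclassifies it.
alert = Alert(algorithm_label="fireworks", algorithm_confidence=0.62,
              reviewer_label="gunshot")
print(alert.final_label)        # "gunshot" -- the human call wins outright
print(reversal_rate([alert]))   # 1.0; the 2021 company account puts this near 0.10
```
Note what's absent from the pattern as described: any requirement for a second opinion or an audit trail when the human call diverges from the machine's.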
But there's a downside to this - one that's just as harmful as some PDs' willingness to treat every suspected gunshot as blanket permission to violate the rights of those who happen to be in the reported vicinity. ShotSpotter's human techs don't just alter reports to distinguish things like a car's backfiring from a suspected criminal's gun firing. They also alter determinations and gunshot locations to better serve the needs of law enforcement agencies that interact with them.
On one hand, we have humans looking for AI errors. On the other hand, we have humans willing to cater to their law enforcement customers. A real land of contrasts sort of situation and one that doesn't exactly inspire more trust in a cop tech company that has routinely overstated the accuracy of its main product.
Unsurprisingly, ShotSpotter execs remain bullish.
ShotSpotter said in a statement to the AP that the human role is a positive check on the algorithm and the "plain-language" document reflects the high standards of accuracy its reviewers must meet.
"Our data, based on the review of millions of incidents, proves that human review adds value, accuracy and consistency to a review process that our customers - and many gunshot victims - depend on," said Tom Chittum, the company's vice president of analytics and forensic services.
This is undoubtedly true. Human reviewers can make judgment calls the software can't. This can help limit false positives. On the other hand, we've seen evidence that ShotSpotter's human reviewers are not nearly as well-trained as the company claims. Their experts are not really experts. And, in at least two cases, the human reviewers have engaged in the sort of customer service that guarantees repeat government business (altering reports to better fit police narratives) but does little to protect the people who actually pay for these services: residents of cities where the tech has been deployed.
If ShotSpotter's human staffers are altering gunshot reports 10% of the time, it means the software isn't as accurate as the company claims it is. And it likely means they're still altering reports at the request of law enforcement agencies, which may feel a false positive needs to be treated as an actual positive, or that a detection was too far away from their rights violations to be useful in the post hoc rationalization of their abuses.
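To put a rough number on that first point: if reviewers overrule the algorithm on one alert in ten, and you generously assume those overrides are mostly genuine corrections (an assumption; reviewers make mistakes too), the unassisted algorithm tops out around 90% on the alerts reaching review:
```python
# Back-of-the-envelope, assuming reviewer overrides are mostly corrections
# (an assumption: human reviewers can also introduce errors).
reversal_rate = 0.10                    # per the 2021 company account
implied_ceiling = 1 - reversal_rate
print(f"Implied ceiling on unassisted accuracy: {implied_ceiling:.0%}")  # 90%
# Well short of the 97% figure pitched to police departments.
```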
Whatever the case, ShotSpotter should be treated with far more skepticism than what's observed in this AP report. The company asserts facts not in evidence, sues journalists for truthfully reporting on its activities, and clearly considers itself to be an essential part of the criminal justice equation. Until the company is willing to let outside experts examine its tech, the company should be treated as part of the problem, rather than a cheap and easy solution to gun crime.