Yet Another Study Shows ShotSpotter Can’t Fight Crime Or Get Help To Shooting Victims Faster
This would seem like a truly extraneous nail in the coffin of ShotSpotter deployment in Chicago, but there are far too many city council members still willing to prop up under-performing tech with faith-based arguments. And there's the company itself, which has shifted narratives (along with redoing the company letterhead) over the past several months in hopes of keeping this part of its revenue stream flowing.
Chicago does have a problem with violent crime, a lot of which involves guns and gunshots. ShotSpotter appeared to be a solution. But, after several years of implementation, the city's Inspector General's office decided to look at the data. What it discovered was that the city had been paying millions of dollars a year in exchange for almost no reduction in gun crime or corresponding increase in successful prosecutions of those engaging in gun violence.
Mayor Brandon Johnson decided the city shouldn't continue to pay for a service that provided no actual services. Several council members - perhaps urged on by ShotSpotter's own pleas for its continued existence - fought back. These efforts were expected. And, to their credit, some council members even got creative with the city's parliamentary rules in hopes of reviving the expiring ShotSpotter contract.
As noted earlier, ShotSpotter (now doing business as SoundThinking) fought back against the IG report by claiming the city was using the wrong metric to measure its effectiveness. It claimed the real value was speedier EMS responses to reported gunshots, not any reduction in crime levels or increase in crime-fighting effectiveness.
It was a weird flex, considering the company deployed most of its marketing muscle (prior to several high-profile failures) claiming ShotSpotter was an essential crime-fighting tool. Now, it's just a thing that might scramble ambulances faster. But even by that metric (one certainly chosen due to the dearth of comparative data in cities where ShotSpotter is used), ShotSpotter still under-performs. Some people have claimed otherwise, using cherry-picked data. But the truth is ShotSpotter is likely no better at saving lives than it is at reducing crime rates.
Here's WBEZ Chicago's Chip Mitchell, speaking to Michael Topper of the Social Science Research Council about the findings of his study [PDF] of ShotSpotter's alleged life-saving abilities:
From your research, how likely is it that ShotSpotter enables police to solve more crimes or to reduce shootings or other gun crimes such as robberies?
What we're finding is that there isn't any evidence that this technology is actually benefiting crime clearance or crime reduction efforts. Because of the cost of these officers arriving slower and not arresting as many perpetrators on 911 calls, we don't have the benefit of gun-related arrests and higher clearance rates across the city.
That's pretty much the same conclusion the city's Inspector General reached a couple of years ago. And while ShotSpotter continues to dispute it, it's also what law enforcement agencies in other major cities have discovered when comparing clearance rates to ShotSpotter deployment.
The new twist is this: ShotSpotter isn't getting emergency personnel to shooting scenes faster. Thanks to the gunshot detection system, scarce resources often aren't available where they're needed most. And that means ShotSpotter likely isn't saving nearly as many lives as it claims to.
This is explained in more detail in Topper's report:
[R]eallocating resources to gunfire detection changes an officer's time allocation. On one hand, this reallocation could be beneficial: ShotSpotter may frequently place officers closer to locations that foster higher volumes of crime. In this situation, an officer's time of arrival may be reduced. On the other hand, these investigations of previously unreported gunfire may incapacitate officers from attending to reports of other crimes in the form of 911 calls, a lifeline for citizens in distress. In effect, these calls may suffer from increased response times, as officers are busy investigating ShotSpotter detections. Consequently, this may have far-reaching implications given the critical importance of rapid response, which has been shown to alter the probability of crime clearance and victim injury. Furthermore, response times may affect timely medical treatment, as emergency medical personnel are required to delay their services until police arrive if their safety is compromised.
Then there's the next part of this ugly equation: as economist Michael Topper discovered during his scouring of the available data, ShotSpotter's claims of faster EMS response times appear to have been cherry-picked to present the system as far more effective than it actually is. At best, the studies ShotSpotter cites are making suppositions based on data sets that are far too small to truly represent the reality of the situation.
A new analysis by the University of Chicago Crime Lab suggests the shooting fatality rates - the odds a gunshot victim dies of the wounds - are about 4 percentage points lower in areas with ShotSpotter. The analysis suggests the technology likely saves roughly 85 lives per year in Chicago. How does that jibe with your findings?
I took a look at this analysis. It relies on a research design known as regression discontinuity. It could be, for instance, a boundary in the middle of a street. On one side of the boundary are ShotSpotter detectors. On the other side, there are no ShotSpotter detectors. So they are comparing the sides of the boundary and finding that ShotSpotter is possibly saving lives. The main assumption here is that nothing else changes across the boundary except for the ShotSpotter detectors. But we can all agree there are many things that could change across the boundary. Just the next block over could be a lot safer. The other thing that this sort of analysis relies on is having a lot of data on both sides of the boundary. And, while I know that Chicago is more violent than many other U.S. cities, the analysis still requires a lot of data - a lot of gunshots and a lot of shooting victims on both sides of this boundary. So, I think the study needs more vetting before we take these estimates very seriously.
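Topper's small-sample objection can be made concrete with a toy calculation. The sketch below (hypothetical numbers, not the Crime Lab's actual data or model) compares fatality rates on the two sides of an imagined ShotSpotter boundary and attaches a normal-approximation confidence interval to the gap. With only a couple hundred shootings per side, even a headline-grabbing 4-percentage-point difference comes with an uncertainty band that comfortably includes zero:

```python
# Toy boundary comparison (hypothetical numbers) illustrating why a
# regression-discontinuity estimate needs many observations on both
# sides of the boundary before a small fatality-rate gap is meaningful.
import math

def fatality_gap_ci(deaths_a, shots_a, deaths_b, shots_b, z=1.96):
    """Difference in fatality rates (side A minus side B) with a
    normal-approximation 95% confidence interval."""
    p_a = deaths_a / shots_a  # fatality rate, ShotSpotter side
    p_b = deaths_b / shots_b  # fatality rate, non-ShotSpotter side
    # Standard error of a difference in two independent proportions
    se = math.sqrt(p_a * (1 - p_a) / shots_a + p_b * (1 - p_b) / shots_b)
    gap = p_a - p_b
    return gap, (gap - z * se, gap + z * se)

# Hypothetical counts: a 4-point gap (14% vs. 18%) measured on only
# 200 shootings per side. The interval easily straddles zero.
gap, (lo, hi) = fatality_gap_ci(deaths_a=28, shots_a=200,
                                deaths_b=36, shots_b=200)
print(f"gap = {gap:+.3f}, 95% CI = ({lo:+.3f}, {hi:+.3f})")
```

And this only addresses the sample-size problem; it does nothing about Topper's other objection, that the blocks on either side of the boundary may differ in ways that have nothing to do with the detectors.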
Topper doesn't go so far as to claim ShotSpotter is completely useless. He points out that similar research indicating ShotSpotter isn't worth paying for suffers from many of the same problems his does: namely, a lack of data. But what data does exist (and can be accessed by researchers - ShotSpotter is a private company with no legal obligation to turn over this data to researchers or public records requesters) has nothing positive to say about ShotSpotter's ability to reduce crime rates or EMS response times.
The financial bottom line is clear: ditching ShotSpotter would free up $10-15 million a year and a whole lot of police officers. And that could mean better response times to reported shootings and EMS scenes requiring a police presence. None of that is guaranteed, no matter how the city ultimately handles the expiring ShotSpotter contract. Having more resources available won't mean much if officers are going to spend their extra free time doing things like chasing Pokemon or frisking every minority person they come across. But, at worst, taxpayers won't be shelling out another $10+ million a year to obtain the same level of under-service they've become accustomed to since ShotSpotter's arrival.