Cities Looking To Dump ShotSpotter Since It's Barely More Useful Than Doing Nothing At All

from the drop-it-like-it's-shot dept

Tech that supposedly detects gunshots has been deployed in multiple cities across the nation with the intent of providing faster response times to possible violence and giving investigators a heads up on where illegal activity may have occurred. The tech has some pretty serious problems, though.

For one, it cannot reliably detect gunshots.

A 2013 investigation of the effectiveness of ShotSpotter in Newark, New Jersey revealed that from 2010 to 2013, the system’s sensors alerted police 3,632 times, but only led to 17 actual arrests. According to the investigation, 75% of the gunshot alerts were false alarms.

The AI can “hear” the percussive noise and attempt to determine whether or not it’s an actual gunshot. It could be a backfire or fireworks or some other noise distinguishably louder than the ambient noise at the detector’s location. Far too often (and far more often than ShotSpotter claims), the guesses are wrong.
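To see why that failure mode is baked in, here's a deliberately crude sketch (this is not ShotSpotter's proprietary algorithm; the threshold and the sample sounds are invented purely for illustration) of what happens when an impulse detector leans mostly on loudness:

```python
# Toy illustration only: a naive loudness-threshold "gunshot" detector.
# Real systems use more acoustic features, but anything loud and impulsive
# can still land on the wrong side of the classifier.
SOUNDS = [
    ("gunshot", 140),          # rough peak decibel levels, invented for the example
    ("fireworks", 145),
    ("car backfire", 130),
    ("dropped pallet", 112),
    ("ambient street noise", 75),
]

THRESHOLD_DB = 120  # invented cutoff for "possible gunshot"

def classify(peak_db: int) -> str:
    """Flag any impulse louder than the cutoff as a possible gunshot."""
    return "ALERT" if peak_db >= THRESHOLD_DB else "ignore"

for label, peak_db in SOUNDS:
    print(f"{label:22s} {peak_db:3d} dB -> {classify(peak_db)}")
```

Run it and the fireworks and the car backfire trigger alerts right alongside the actual gunshot, which is exactly the kind of misfire the Fall River numbers below describe.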

Out of Fall River’s 51 ShotSpotter activations in 2017, 21 have been false alarms, a 41 percent error rate. The sensors often report loud noises such as car backfires and fireworks as gunshots.

Dupere said there have been another 15 ShotSpotter activations this year that police later determined were unfounded because the responding officers found no evidence of gunfire or any witnesses to corroborate that they had seen or heard gunshots.

More disturbingly, the determinations made by the software can apparently be overridden if investigators truly want a shot to be spotted by tech like that offered by ShotSpotter. ShotSpotter personnel have altered AI judgment calls to fit police narratives. Worse, they’ve also moved detected shots from one location to another to better fit law enforcement’s theory about who was involved in a shooting and where it happened.

ShotSpotter is an investigative tool, but it’s a particularly malleable one. Investigators and officers seem pleased ShotSpotter personnel are willing to alter records to better suit their theories and narratives. But that’s completely at odds with the ideal of evidence introduced in criminal cases, which is supposed to be free of bias and deliberate manipulation. When (criminal) history can be rewritten on the fly, the facts of the case are no longer factual.

These problems are exacerbated when law enforcement assumes any noise originating from ShotSpotter detectors is a gunshot and sends officers in to harass the locals until something turns up. Like almost every other tech advancement deployed by cops (especially “predictive policing,” but there are others), it tends to serve pre-existing biases held for decades by the law enforcement community: that minorities are inherently more “dangerous” than the white folks who tend to be overrepresented in law enforcement agencies.

Cities and police departments are loath to disclose the locations of their ShotSpotter sensors, but through public records requests Motherboard also obtained years of data from Kansas City, Missouri; Cleveland, Ohio; and Atlanta, Georgia showing where ShotSpotter sensors generated alerts—a proxy for the general location of the sensors.

In all four cities, the data shows that the sensors are also placed almost exclusively in majority Black and brown neighborhoods, based on population data from the U.S. Census.

So, there’s the bigotry angle. This is on top of the evidence-faking issues. And then there are the limitations of the tech, which make it an unwise investment for cities that actually want to do something about violent crime, rather than just let cops engage in biased policing.

The San Diego City Council was scheduled to vote July 27 on a $1.15 million renewal of its ShotSpotter contract, but city officials withdrew the item from the agenda after dozens of residents submitted public comments opposing the contract.

“ShotSpotter is a perfect example of the patronizing approach to public safety in Black and brown neighborhoods—this idea that we know what’s best for Black and brown community members,” Khalid Alexander, president of Pillars of the Community, told Motherboard. Pillars is an organization that opposes the over-policing of Black and brown residents, and is based in San Diego’s District 4—the only area of the city where ShotSpotter is located.

“I haven’t heard one person in District 4 … come out and say that the ShotSpotter technology is something needed to keep us safe,” Alexander said.

Unsurprisingly, ShotSpotter says the opposite:

In a statement emailed to Motherboard, ShotSpotter claimed that local communities have told them the opposite, and that it would be “false and misleading” to describe the activists as representing the larger population. “As the communities who suffer most from gun violence, they recognize and appreciate ShotSpotter’s role helping police departments combat gun violence,” the company wrote.

While there’s certainly some confirmation bias in play here, it’s safe to say it’s more than just a land of contrasts out there. But the only party that really has any money at stake is ShotSpotter, and self-preservation is a far stronger motivator than simply feeling underserved by law enforcement agencies that have decided to outsource serving and protecting to passive sensors and artificial intelligence. Taxpayer frustration may touch on fiscal issues, but since it’s an aggregate fund, unhappy citizens tend to be less motivated by their bottom lines than companies with multi-million dollar contracts on the line.

And maybe $1.15 million/year seems like a drop in the bucket when it comes to police department budgets. But residents are sick of ShotSpotter and its ability to send officers expecting violence flooding into communities with guns akimbo. In Chicago, the contract now in danger of going unrenewed is $33 million, which is a very tangible hit to the bottom line of a company that early last year reported record quarterly revenue of a little over $10 million.

Why keep paying for something that isn’t doing anything much more than allowing alarmists to claim that violence is on the rise and the only cure is more cop violence? San Diego might not be paying much for its limited rollout of ShotSpotter, but even its small sample set is a complete disappointment.

A San Diego Police Department spokesperson told Voice of San Diego that during the four years ShotSpotter had been in use (as of September 2020) officers had made only two arrests responding to an alert and only one of those was directly linked to the alert.

Meanwhile, 72 of the 584 ShotSpotter alerts during that time period were determined to be unfounded, “a whopping 25 times higher than the 0.5 percent false positive rate put forth by the company,” the Voice of San Diego reported, based on data provided by the city’s police department.

Thirty-six times more false positives than useful investigative leads. $4.6 million (over four years at the going rate) should buy more help than that. That's only two arrests better than having nothing at all, which is so far below rounding error as to be laughable.
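For anyone who wants to check that math, here's a minimal sketch using only the figures reported above (the per-year contract price, the alert and arrest counts, and the company's claimed 0.5 percent rate):

```python
# Figures as reported via Voice of San Diego and the SDPD (see above).
alerts = 584           # ShotSpotter alerts over roughly four years
unfounded = 72         # alerts later determined to be unfounded
arrests = 2            # arrests made while responding to an alert
claimed_rate = 0.005   # the 0.5 percent false positive rate the company puts forth
annual_cost = 1.15e6   # contract price per year, in dollars
years = 4

unfounded_share = unfounded / alerts
print(f"Unfounded share of alerts: {unfounded_share:.1%}")                    # ~12.3%
print(f"Times higher than claimed: {unfounded_share / claimed_rate:.0f}x")    # ~25x
print(f"Unfounded alerts per arrest: {unfounded // arrests}")                 # 36
print(f"Contract dollars per arrest: ${annual_cost * years / arrests:,.0f}")  # $2,300,000
```

The dollars-per-arrest line is just the contract price divided by the two arrests. It's a blunt measure, but it's the one taxpayers are being asked to fund.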

Even if the tech improves as time goes on, we still need to ask why we even need it. So far, all it’s done is make ShotSpotter more money and justify biased policing efforts by law enforcement agencies willing to make use of any new development that allows them to keep doing the things they’ve always done. Taxpayers shouldn’t be asked to fund programs that set cops up for failure. And they certainly shouldn’t be expected to pay for the dubious privilege of being subjected to a system more capable of altering criminal evidence than making neighborhoods safer.

Companies: shotspotter


Comments on “Cities Looking To Dump ShotSpotter Since It's Barely More Useful Than Doing Nothing At All”

This comment has been deemed insightful by the community.
That One Guy (profile) says:

'Our tech is vital (to our continued existence as a company)!'

Arguably ShotSpotter is worse than nothing as at least nothing doesn’t give you false alarms and fraudulent ‘evidence’ that could land innocent people in jail or prevent guilty people from going to jail.

"As the communities who suffer most from gun violence, they recognize and appreciate ShotSpotter’s role helping police departments combat gun violence,” the company wrote.

‘You wouldn’t know them though, the communities are all over in Canada…’

Yeah, I’d need a whopper of a [Citation Needed] to back that one up, and of vital importance would be whether or not those communities know just how bad the tech and the company are at their purported jobs.

Anonymous Coward says:

Re: Car Backfire, what year is it?

I wouldn’t say I constantly hear them, but three times in the last couple weeks I’ve heard one. One was so loud I looked around for the bomb/gas line explosion. One was from a Civic Type R that wasn’t more than a couple years old.

Also the average age of vehicles in the US is 11.8 years. In some states it stretches up to 16 years. Combine that with a lack of emissions testing and there are plenty of cars making very loud bangs. Many years ago I may have popped the muffler off my 1994 Subaru Legacy making it crackle and pop like a race car while taking over 10 seconds to reach 60 MPH.

Scary Devil Monastery (profile) says:

Re: Car Backfire, what year is it?

Children’s toys, firecrackers, movies, podcasts, YouTube videos, radio shows, tire blowouts, people dropping things; the acoustic signature of a "gunshot" is just the signature of violently displaced air.

I can only imagine Chinese New Year, when "shotspotter" will be reporting that World War 3 just broke out in certain neighborhoods and cops will show up ready to reenact My Lai.

Anonymous Coward says:

Nothing is ever one thing.

In all four cities, the data shows that the sensors are also placed almost exclusively in majority Black and brown neighborhoods, based on population data from the U.S. Census.

So, there’s the bigotry angle.

So bigotry is automatic?

Which areas have the lowest per capita income?
Which areas had the most gun-related crime reported before they installed ShotSpotter?
Is there significant overlap between those two charts?
When you want to install "miracle shot detection technology", where do you place it?

I think you fell for lazy journalism. The implication "this is a result of racism" sells more clicks than "this is a result of income inequality" or "this is a result of education inequality", even when all three are equally true and reinforce each other. Feel free to substitute your preferred "cause of crime", as that too is likely to be true.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Nothing is ever one thing.

Which areas had the most gun-related crime reported before they installed ShotSpotter?

The areas where cops are already predisposed to claim and write up gun-related crime.

I think you fell for the "prejudiced shit doesn’t already happen and it takes active, conscious bigotry to do racist shit" story.

Anonymous Coward says:

Re: Nothing is ever one thing.

The education and income inequality is largely caused by systemic racism, so it is not inaccurate to use the catch-all term racism instead of listing out every way racism has harmed minority communities.

There is plenty of writing on how the deployment of "data based" policing is heavily influenced by the people deploying it and then manipulated after its deployment. The Intercept was even able to find LA cops who would speak about how trash of a program COMPSTAT is. The same issues with that system are also present with ShotSpotter.

Scary Devil Monastery (profile) says:

Re: Nothing is ever one thing.

"So bigotry is automatic?"

I think you need to learn what "bigotry" means.

Racism means "bias against, based on race". If disadvantaged neighborhoods exist which disproportionately contain members of a certain ethnic minority, then that’s systemic racism right from that point onwards.

If an algorithm then places surveillance gear primarily among brown people, then yes, the end result is racist, as it will be any time ethnicity ends up determining the proportionate makeup of the population sample you’ve selected.

ECA (profile) says:

A system that could work, except

You can make this work fairly well, with some consideration.
YOU DON'T NEED another company to do it.
You do need police in the area to track it, not 5 miles away racing to the location.
The biggest headache in this is a canyon of buildings, echoes bouncing here and there, and you haven't tested all the ramifications of the echo effects.

This comment has been deemed insightful by the community.
Anonymous Coward says:

$4.6 Million

…would employ 145 people at $15/hour for one year.
…would cover the daycare of 300 children.
…would pay the 4-year tuition to UC San Diego for 80 students.
…equals the compensation of 50+ Social Workers
…could have purchased 100 tiny houses for the homeless

But they spent it on junk technology to arrest one person.

Scary Devil Monastery (profile) says:

Re: $4.6 Million

"But they spent it on junk technology to arrest one person."

Because arresting a potential criminal is more important than any number of people dying because the government nanny wasn’t around. What are you, some bleeding-heart liberal? Smart people willingly pay for a bunch of cops to show up and shoot someone who was probably a bad guy, but paying for others to go to hospital? That’s theft. Besides, the kid should have known better than to play with toy guns in the park.

/s because this is how oh so many morons truly think…

nasch (profile) says:

Re: Re: Re: False positive rate

"To be absolutely fair, a 5‰ false positive rate"

Is 10 times higher than the company claimed. Unless that weird percent symbol makes that mean 0.5%.

"is entirely compatible with 72/584 reports being false alarms."

72/584 is .12, or a 12% false positive rate. As mentioned, 25 times worse than advertised.

"It just means that out of 200 bangs that aren’t gunshots, it messes up on one of them."

That would be a 0.5% false positive rate, which is what was claimed but not what happened.

From the other reply:

"so is mine but…"

Your math is not only wrong but getting worse. 5% is 5/100 or 1 in 20.

Scary Devil Monastery (profile) says:

Re: False positive rate

"To be absolutely fair, a 5‰ false positive rate is entirely compatible with 72/584 reports being false alarms."

…you fail math fundamentals. A 5% false positive rate doesn't mean 5% of all the reports are false positives. It means 5% of all the sounds remotely resembling a gunshot have been falsely reported as gunshots.

Backfiring cars, fireworks, toy guns, crime shows on Netflix… any sudden air displacement generating a noise within the range being listened for.

Since we don’t know how many loud noises were made within earshot of ShotSpotter, we can’t get an accurate percentage rate to begin with, but what the company has issued is certainly not covering the false positives generated in vivo.
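To make the distinction in this thread concrete: a false positive rate is measured against all the candidate noises the system evaluates (the denominator nobody outside the company knows), while the 12% figure is the share of issued alerts that turned out to be unfounded. A quick sketch, with the number of candidate noises invented purely for illustration:

```python
# The candidate-noise count is hypothetical; ShotSpotter doesn't publish
# how many sounds its sensors actually evaluated.
candidate_noises = 50_000   # assumed: loud, impulsive, non-gunshot sounds heard
false_alerts = 72           # alerts later determined to be unfounded (reported)
total_alerts = 584          # total alerts issued (reported)

false_positive_rate = false_alerts / candidate_noises  # false alerts per candidate noise
unfounded_share = false_alerts / total_alerts          # what the 12% figure measures

print(f"False positive rate over candidate noises: {false_positive_rate:.2%}")  # ~0.14%
print(f"Unfounded share of issued alerts: {unfounded_share:.1%}")               # ~12.3%
```

The two numbers can differ wildly depending on that unknown denominator, which is why the company's 0.5% claim and the 12% figure aren't directly comparable.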

Anonymous Coward says:

WHAT COULD GO WRONG!

I have seen many cases where someone gets falsely arrested for nothing or someone's house gets UNLAWFULLY searched! Then we have the blue lies mafia changing the data to fit their report!
Even when this tech is used the right way, you would be lucky to get an arrest 1 out of 500 times! When abused, you get more lawsuits than what the tech costs!
