Should Miami-Dade Police Spend $5 Million on a Gunshot Detector? | Miami New Times
Should Miami-Dade Police Spend $5 Million on a Gunshot Detector They Abandoned Before?

Last week, Miami-Dade County Police Director Juan Perez announced in a press conference that the department will spend $2.6 million over the next five years, and possibly another $3 million in the following five, on ShotSpotter, a police technology that uses microphones to listen for gunshots. Perez did not mention during his talk with the media that MDPD had tried out ShotSpotter once before, and abandoned using it in 2013 after finding it didn't actually help reduce gun crime. New Times has asked the department why it did not disclose its first failed experiment with ShotSpotter before requesting millions in public money to buy it again.

But, in an interview with New Times, ShotSpotter CEO Ralph Clark defended Miami-Dade's first experiment with his technology, as well as the department's decision to spend millions to try out his product a second time.

"It's a way to connect to a community in a way you haven't connected before," he says. "Gun violence is significantly underreported. Guns are fired, and people don’t call."

He suggested Miami-Dade's first experiment with ShotSpotter may have failed because the department did not use it properly: MDPD deployed it across only a few square miles, he said, when the technology works best when deployed across an entire town. With a revised plan in place, Clark said, the microphones could help reduce gun crime.

But, he said, ShotSpotter would likely not lead to many "actionable" results, like increased arrests. That raises an important question: If the technology does not yield decisive results, costs millions, and was abandoned once before, is it worth spending a minimum of $2.6 million to try it a second time?

There has been a disturbing level of gun violence this summer. Multiple children have been killed by stray bullets. Certainly, MDPD should try to reduce gun violence.

But ShotSpotter has existed for the last 20 years, and whether it deters gun crime is still a matter of scholarly debate. It's used in more than 90 cities around the country — the tech listens for the sounds of gunshots, and then those gunshots light up on a map. In most cases, departments dispatch officers to the scenes where ShotSpotter says guns were fired.

ShotSpotter's proponents argue it helps reduce gun crime, since gunfire is underreported in most inner-city areas. But some police departments and media organizations say the technology (A) has historically reported a large number of false positives, and (B) does not directly lead to many arrests or prosecutions.

Three years ago, public radio station WNYC reported that 75 percent of the gunshots ShotSpotter had reported in Newark since 2010 were false alarms. The radio station also said that of the 3,632 alerts Newark Police received, just 17 shooters were arrested on-scene, though ShotSpotter openly argues it is not a tool to catch perpetrators in the act.

The Center for Investigative Reporting's Reveal radio show delved deep into San Francisco's ShotSpotter data, and found similar results. Of 3,000 ShotSpotter alerts the department received in 2.5 years, just two alerts led to arrests. One of those incidents was not even related to gunfire: Instead, police arrived to find a drunk man with outstanding warrants. Reveal also reported that evidence gleaned from ShotSpotter played a "minimal role" in subsequent prosecutions.

Most departments do say that ShotSpotter helps them get to the scenes of shootings faster. But some have questioned whether it's worth spending millions upfront (and then tens of thousands in annual fees) for a slight improvement in reaction time.

Clark said Miami-Dade had previously been given a "loaner model" to test if it liked ShotSpotter. An MDPD spokesperson later told New Times that the technology produced too many false positives for them to continue using it.

ShotSpotter's "success in directly leading to the apprehension of individuals involved in shooting incidents [was] minimal," MDPD told New Times in 2014. MDPD said ShotSpotter sent county cops to 1,000 suspected shootings in 2012, and only 50 of those shootings turned out to be real.

Clark, the CEO, argues that Miami residents shouldn't read too much into that data. He says police departments are often not set up to link evidence — like shell casings picked up from the scene of a crime — directly back to ShotSpotter.

"Some very forward-thinking police departments have some things coded to say, 'If it were not for ShotSpotter, X would not have happened,'" he says. But he says that is far from the case in most departments. "It's a little bit frustrating for me. I would love if all customers did that. If they're getting great outcomes, I wish they would organize themselves to report these outcomes. But that's not the way agencies are organized." (He said Broward County, which also abandoned ShotSpotter after a trial run, had done a "crappy job" using the product.)

ShotSpotter's real value, he says, is "getting cops to dots, and getting them out of their cars" to walk around and talk to the community after shots are fired. Perhaps some shell casings from a ShotSpotter alert would lead to the apprehension of a serial shooter, he suggested.

But there is little independent data that shows major decreases in gun crime, overall, in cities that use ShotSpotter.

A study ShotSpotter commissioned in 2011 said the program helped departments get to shootings faster, improve detective work, and keep better data on gun crime. But ShotSpotter itself was allowed to hand-select the study's participants, and there were only seven respondents. "Command staff feel that ShotSpotter is at least partially responsible for [a] downward trend [in shootings], however they note that non-gunfire-related crimes in their cities had gone down, and this is not credited to ShotSpotter," the study says.

Among independent analysts, ShotSpotter's efficacy is still a matter of debate: The U.S. Audio Engineering Society released a 2015 report which said the jury was still out on whether the technology helps cut down on gunshots.

Peter Scharf, a criminologist at Tulane University, also told the New York Times in 2012 that it still isn't clear how useful ShotSpotter really is.

"Whether this will be seen long-term as a short-term law enforcement fad or fundamental to the way police work, that, I think, is the question,” he said. “I don’t think the effectiveness or efficiency arguments have been settled quite yet.”

The Times also mentioned at least one instance in which ShotSpotter microphones picked up the sound of a public argument following a shooting. That audio evidence was later entered into trial, but that incident appears to be an outlier. The American Civil Liberties Union said that ShotSpotter audio recordings had been used in other court cases as well, but said they were "not losing sleep" over the technology.

ShotSpotter publishes a host of data online touting its efficacy. A "National Gunfire Index" from March 2015 says that of the 46 cities where the technology was deployed for multiple years, roughly 75 percent reported reductions in gunfire. The median reduction in gunfire rates was 12.8 percent, and six cities saw reductions of 30 percent or more.

But remarkably, the index does not say whether those gunfire reductions were directly attributable to ShotSpotter. Nor does the study control for any other changes in a department's patrol strategy, like altered routes, added cops, or public-service campaigns.

ShotSpotter's self-commissioned 2011 study confirmed that false positives were a large problem for many departments. To combat them, the company began monitoring all ShotSpotter alerts itself before they were sent to police departments. This, Clark said, would allow seasoned ShotSpotter technicians to distinguish between real and fake gunfire.

Clark said this is the same technology that Miami-Dade police used from 2011 to 2013. It's unclear why or how MDPD's new go-round would be different.

He said Miami-Dade should use ShotSpotter to get more cops out in the community and talking to residents when guns are fired.

"It would be great if, after you’ve secured the scene, you knock on doors and ask people if they're okay," Clark says. "Like, 'We got a ShotSpotter alert, wanted to check in and make sure you’re okay?'"

But police-reform advocates have been calling for that type of policing for years, and they say departments should be capable of getting cops out of their cars and interacting with the public for little or no money.

Regardless, the county's decision to green-light up to $5.6 million for ShotSpotter came with little scrutiny. (Especially when one considers that the county declined to spend just $100,000 to revive the Civilian Investigative Panel last week.) Even if ShotSpotter had worked well for MDPD in the past, the fact that multiple county commissioners told New Times they had no idea the county had used the technology once before is unacceptable. A $5.6 million investment in controversial technology should, by all accounts, have been debated more.