Filed under Ethical issues.

A ShotSpotter analyst at a workstation

ShotSpotter’s incident-review room is like any other call centre.

Analysts wearing headsets sit by computer screens, listening intently.

Yet the people working here have an extraordinary responsibility.

They make the final decision on whether a computer algorithm has correctly identified a gunshot – and whether to dispatch the police.

Making the wrong call has serious consequences.

ShotSpotter has garnered much negative press over the last year. Allegations range from claims that its technology is inaccurate to claims that ShotSpotter fuels discriminatory policing.

In the wake of those negative news stories, the company gave BBC News access to its national incident-review centre.

ShotSpotter is trying to solve a genuine problem.

“What makes the system so compelling, we believe, is a full 80-95% of gunfire goes unreported,” chief executive Ralph Clark says.

Source: BBC Technology News

Date: November 26th, 2021



  1. It is often claimed that technology reduces bias in a situation, or conversely that it fuels bias. How might technology like this do either (or both) of these?
  2. In what ways could this technology be adapted for other uses?
