Lawmakers in Worcester, Massachusetts, voted last Tuesday to approve the local police department’s use of artificial intelligence software that directs patrol routes, drawing quick criticism from local and statewide civil rights groups, which said the new platform could exacerbate the racial biases and privacy concerns that often accompany algorithmic surveillance tools.
The software, ShotSpotter Connect, will join a portfolio that also includes ShotSpotter’s gunshot-detection devices, which have been mounted on utility poles and building roofs in Worcester since 2014. The new purchase came after City Manager Edward M. Augustus Jr. asked the city council for funds to buy software that incorporates data analysis and AI models that tell police where to go to prevent crime before it happens, not just after the sound of gunfire.
The crime-forecasting technology is currently in use in just five cities across the U.S., including Chicago and Savannah, Georgia, though Worcester officials like Augustus and Police Chief Steven Sargent have been optimistic about its potential to deter crime in the central Massachusetts city.
“Among the many benefits, these improvements will allow us to be more efficient in policing deployments by directing officers to the right place at the right time,” Sargent wrote in a letter to Augustus in January. “The visibility of officers also helps to stabilize neighborhoods and addresses the fears of those living in the area. We look forward to incorporating these tools into our daily operations and we will monitor our progress with implementing the program.”
The city will spend nearly $150,000 on the new software and additional gunshot-detection devices. At a city council meeting in January, Augustus argued for the new platform by explaining to council members that the software would only automate the analysis of crime data that’s already being used to adjust where and when patrols are scheduled.
But relying on automated data analysis could leave the city prone to biases in the algorithms ShotSpotter uses, a coalition of 19 civil rights organizations argued in a letter sent to council members before the vote. In particular, the groups expressed disappointment after Augustus’ call last month for a ban on facial recognition technology within city government.
“We appreciate the City Manager’s proposal to ban artificial intelligence-driven facial recognition software and ask that City Councilors, Worcester Police Department leaders and the City Manager see the same red flags, concerns and racial biases in ShotSpotter Connect,” wrote the groups, which included the American Civil Liberties Union of Massachusetts and the local branches of the NAACP and Democratic Socialists of America.
Racial and gender biases in government algorithms have increasingly concerned activists and some public officials over the past several years. Several cities in Massachusetts, including Boston and Cambridge, have already successfully banned government use of facial recognition with help from groups like the ACLU, citing reliability and privacy concerns.
Though some governments have imposed temporary moratoriums, the technology remains widespread in the private sector. Research conducted by the National Institute of Standards and Technology in 2019 found that facial recognition algorithms generally misidentify non-white people far more often than white people, especially white men. Last year, a pair of wrongful arrests were made in Detroit on the basis of facial recognition data, after which Detroit Police Chief James Craig admitted the system misidentifies people 96% of the time.
Cities like Pittsburgh and New York City have also taken steps to address implicit biases in other government algorithms, launching task forces to review whether automated processes for housing and policing inherently limit opportunities for minorities and low-income residents.
In Worcester, the groups urging the city to eliminate algorithmic biases argued that its embrace of ShotSpotter Connect runs counter to that goal.
“If the goal is to eliminate structural racism and implicit biases, implementing ShotSpotter Connect software is the wrong choice,” their letter read.
The coalition asked that the $150,000 instead be used to fund a school safety plan.