As ‘predictive policing’ data tools spread nationwide, civil rights advocates sound the alarm
A new report shows that many of the nation’s largest police departments are adopting “predictive policing” software, but researchers and civil rights advocates claim the systems often prove ineffective and reinforce existing biases among law enforcement officers.
The technology policy firm Upturn released a report Wednesday showing that at least 20 of the nation’s 50 largest police forces have used a data analytics tool to attempt to predict where crimes will occur or who will commit them, and at least 11 more have actively explored adopting predictive policing systems.
Yet Upturn’s researchers found scant evidence that those systems actually helped police reduce crime in those areas; instead, they led officers to focus on minority communities that they already come into contact with at disproportionate rates.
That prompted 17 civil rights organizations — including the Leadership Conference on Civil and Human Rights, the American Civil Liberties Union, and the NAACP — to issue a joint statement condemning predictive policing systems as tools that “threaten to provide a misleading and undeserved imprimatur of impartiality for an institution that desperately needs fundamental change.”
“This is fortune teller policing that uses deeply biased and flawed data and relies on vendors that shroud their products in secrecy,” Wade Henderson, the Leadership Conference’s president and CEO, said during a call with reporters. “They supercharge discrimination, profiling and the over-policing of certain communities, particularly those of people of color.”
In all, Upturn’s researchers found that police in cities from Los Angeles to Chicago have used predictive policing systems — many are “place based,” aimed at understanding where crimes are most likely to occur, while others are “person based,” which are designed to calculate how likely individual people are to commit crimes.
By working with software vendors and criminal justice researchers, as well as analyzing publicly available police contracts, Upturn was able to identify 10 companies selling predictive policing tools around the country. Of those companies, most relied on historical crime data to generate results, while others pulled in geographic data and even social media information.
“Bringing corporate interests into the criminal justice system will only yield bad results,” said Rashad Robinson, executive director of Color of Change.
The researchers noted that the mere fact of relying on crime data for analysis can produce misleading results for police. After all, crime data is “greatly influenced by what crimes citizens choose to report, the places police are sent on patrol, and how police decide to respond to the situations they encounter,” the researchers wrote.
“Those forecasts are only as good as the data they’re based on,” said David Robinson, a principal at Upturn.
Ezekiel Edwards — director of the Criminal Law Reform Project at the ACLU — added that many departments fail to even collect data accurately, pointing to his group’s own struggles to get accurate information from police departments on how many people they’ve killed or how many “stop and frisk” searches they’ve conducted.
“Despite police excitement around using big data, many police departments struggle to collect data in a comprehensive and transparent manner,” Edwards said.
Indeed, the researchers and advocates charge that a department’s own enforcement priorities (which “can vary widely from one neighborhood to another”) could even skew that data, “leading to a cycle of self-fulfilling prophecies” as officers continually target areas that they’ve already designated for increased attention.
“Whenever departments focus their attention on a particular place or group of people, police will detect and respond to more of the crime that happens in those places, or among those people, than they will detect or respond to other crime,” the researchers wrote. “Even when police have a good reason to focus their attention, such as a particular neighborhood struggling with a violence problem, that focus will nonetheless distort the relationship between police statistics and true levels of crime.”
The researchers also examined the efficacy of these tools, and they found that “independent research has yet to find any benefit for community safety.” For example, Upturn points to the RAND Corporation’s study of Chicago’s person-based “Strategic Subject List,” which found that the tool failed to reduce gun violence at all.
Instead, in Chicago and elsewhere, the researchers found a variety of studies showing that “rather than changing their tactics, police who use predictive tools have tended to focus on generating more citations and arrests.”
Yet Upturn also found that police departments largely failed to use the tools for the sort of early intervention practices — to reach people with mental illnesses and struggling officers alike — that could help the systems do the most good.
“One big difference between those applications, which may be helpful, and the predictive policing we’re seeing today is the nature of the intervention,” Robinson said. “If we make a mistake about someone’s mental health and accidentally offer them clinical services that they may not need, that’s one kind of harm, but if the kind of predictive policing system we see today makes a mistake, that could lead to an arrest that knocks someone’s life off track.”
But Robinson did note that Charlotte-Mecklenburg stands as an excellent example of how a department could use predictive systems to analyze its own officers — the department worked with researchers at the University of Chicago to study which officers were most likely to have an “adverse interaction” with someone over the next two years, and proactively reached out to provide counseling to those officers.
The department managed to use that model to outperform its existing model for officer intervention, and cut down on negative interactions with the public in the process, the researchers found. Yet Robinson lamented that Charlotte-Mecklenburg is an outlier, and he’s skeptical departments will ever embrace the use of these systems for their beneficial purposes while leaving the potentially discriminatory elements behind.
“I think right now, the odds look pretty long, because we’re really stuck in a situation where there’s a lot of forces pushing toward the reinforcement of these structural biases,” Robinson said.
But the researchers and advocates charge that more transparency and public debate around the use of these systems could prompt some major changes, since they’re currently shrouded in secrecy.
Upturn noted that Chicago was the lone city the researchers studied that publicly posted a policy detailing how the department uses these tools, and found that “an open public debate regarding a police department’s potential adoption of a predictive policing system seems to be the exception to the rule.”
“The handful of exceptional cases in which these tools have occasioned public debate point strongly to the need for more transparency and a clearer view of how these tools work,” the researchers wrote.