Pitt think tank to review ethics of algorithms in local governments

The former federal prosecutor running the task force said there's a place for AI-backed technologies like facial recognition, but that governments can "do better."

A think tank at the University of Pittsburgh this week formed a task force to review how the surrounding local governments use computer algorithms to inform policy decisions. The Pittsburgh Task Force on Public Algorithms, run out of Pitt’s Institute for Cyber Law, Policy and Security, will spend the next year considering whether the algorithms the city of Pittsburgh and surrounding Allegheny County use to determine policing and housing policies are inherently biased.

Pitt Cyber director David Hickton, who will also lead the 22-member task force, told StateScoop in an interview that increased use of data has made government nominally more efficient, but that policies based on analytics and number-crunching need to be checked so that they reduce, rather than reinforce, biases against groups like people of color, women and immigrants.

“Whether you’re talking about screening children at risk or preventative policing or any other means of services to citizens, the advent of big data has allowed for more efficient process of decision making,” Hickton said. But, he added, “we’re flying the plane as we build it and we need to make sure the data is good and the algorithms are good.”

The task force’s membership is made up of Pitt professors, academics from other universities, nonprofit executives and civil-liberties activists, while representatives of Pittsburgh and Allegheny County serve as an advisory board for its research. But Hickton, who served as the U.S. attorney for Western Pennsylvania, said he thinks the city and county have a better record than other local governments on their use of algorithms.

He praised the Allegheny County Department of Human Services for its use of a tool that uses an algorithm to screen calls about possible child abuse or neglect. The tool scores calls on a scale of 1 to 20 — with 20 being the highest risk for neglect or abuse — to help the department assign case workers to households where children are potentially being mistreated. While the Allegheny Family Screening Tool was criticized upon its introduction in 2016 for its potential to subject low-income households to higher levels of government scrutiny, a third-party evaluation last year found that it reduced racial disparities in the county’s child-welfare system.

“Their leadership in that area is outstanding,” Hickton said.

The Pitt algorithm task force comes two months after the conclusion of a similar task force impaneled by New York Mayor Bill de Blasio to study the algorithms his city uses in forming police and housing policies. While that city-led task force ultimately led to the creation of a still-unfilled algorithm policy officer position, its work sometimes generated acrimonious debate, with members arguing over the definition of algorithms, and the group took nearly a year to hold its first public meeting.

The Pitt task force does not plan to wait as long to meet publicly, with its first two open meetings scheduled for March.

Along with Allegheny County’s child-welfare tool and the city of Pittsburgh’s policing algorithms, Hickton said the task force may also take a look at facial-recognition technology, which is a frequent target of groups like the American Civil Liberties Union and has been banned by a growing number of cities, most recently Cambridge, Massachusetts.

Hickton, the former federal prosecutor, said he’s not a facial-recognition naysayer. While he praised the technology’s potential efficiency in airport screenings or security applications like logging into bank accounts, he said it frequently misidentifies women, racial minorities and non-U.S. citizens.

“Facial recognition is great,” he said. “The problem is the database is disproportionately white, male and American, so we have a high failure rate.”

He referenced a 2018 test that the ACLU ran on Rekognition, a facial-recognition platform sold by Amazon, that scanned photos of all 435 members of the House of Representatives and incorrectly matched 28 of them with people who had been arrested for various crimes.

One of the task force’s roles, then, Hickton said, will be to determine whether the technologies believed to make government more efficient can do so while also making it more fair.

“There’s no doubt the digital space is efficient and costs less money,” he said. “But efficient government and reinforcing inequity is a false choice. We can do better.”