
To be fair, New York City assembles algorithm task force

The panel will look at whether the city's automated decision systems are racially biased.

New York Mayor Bill de Blasio announced on Wednesday the members of the city’s newly created Automated Decision System Task Force, a panel drawn up to examine whether the algorithms the city uses to make decisions result in racially biased policies. Emily Newman, the acting director of the mayor’s operations office, and Brittny Saunders, a deputy human rights commissioner, will co-chair the 16-member body.

The unveiling of the task force’s roster comes about six months after the New York City Council voted to study municipal agencies’ automated decision-making systems for algorithmic bias. The city uses algorithms for numerous functions, including predicting where crimes will occur, scheduling building inspections, and placing students in public schools.

But algorithmic decision-making has come under growing scrutiny in recent years as it has become more commonplace in local government, especially with respect to policing. The legislation that prompted the new task force came about after then-Councilman James Vacca read a ProPublica investigation that revealed law-enforcement algorithms used in New York and elsewhere routinely assigned higher recidivism risk scores to black defendants than to white defendants.

In a press release, de Blasio’s office said the new task force will be the first of its kind in the United States.


“Whether the city has made a decision about school placements, criminal justice, or the provision of social services, this unprecedented legislation gets us one step closer to making algorithms accountable, transparent, and free of potential bias,” City Council Speaker Corey Johnson said in the release.

Beyond co-chairs Newman and Saunders, the task force’s 14 other members include college professors and nonprofit executives.

The task force is charged with the “development and implementation of a procedure that may be used by the city to determine whether an agency automated decision system disproportionately impacts persons based upon age, race, creed, color, religion, national origin, gender, disability, marital status, partnership status, caregiver status, sexual orientation, alienage or citizenship status.” 

Vacca’s original bill attempted to go much further by requiring city agencies that make decisions based on algorithms to publish their code, but the final bill was criticized for having rather relaxed disclosure requirements. Instead of compelling city agencies to turn over information related to their algorithms, the task force will only be able to use what agencies hand over voluntarily, the New Yorker reported last December.

The task force is scheduled to hold its first meeting next month and will convene periodically over the next year-and-a-half, said Erin Kuller, a spokeswoman for the mayor’s operations office. While the task force will be “working closely” with several agencies — including the New York Police Department, Mayor’s Office of Criminal Justice, and the departments of education, social services, and transportation — the de Blasio administration did not say how the task force will ensure those agencies comply with its requests for information about their algorithms.


The task force has until December 2019 to produce its report, which will be made public, Kuller said.
