NYC oversight hearing exposes gaps in agencies’ use of AI, surveillance tools
During a New York City Council oversight hearing on Monday, a representative from the city’s technology office struggled to answer questions about how agencies use artificial intelligence, biometric data and other surveillance tools.
Lawmakers are considering two local bills that would ban biometric data collection in businesses and residential buildings citywide, and though both bills largely exempt government use, they pressed the agency representatives to explain how such tools are deployed across the New York City government.
During his testimony, Alex Foard, assistant commissioner of research and collaboration at the Office of Technology and Innovation, lacked answers to several key questions from the council, sometimes deferring to other offices. Foard said, for instance, that his office doesn’t track all of the city’s biometric data collection, only tools reported under Local Law 35, a 2022 law that requires city agencies to annually disclose their use of algorithmic tools that can impact the public, like facial recognition software.
Technology Committee Chair Carmen De La Rosa, who led Monday’s proceedings, said the lack of transparency from OTI reflects a pattern of behavior dating back to former Mayor Eric Adams’s administration. She pointed out that the council’s previous two requests for information about agency use of these technologies, last June and December, went unanswered.
And while Foard said OTI does not maintain a comprehensive list of all agencies collecting biometric data (he said he’d have to “take that back” to the agency’s chief privacy officer for a complete answer), he did name three tools used by city agencies that rely on biometric data, disclosed in the 2024 Local Law 35 report. These were the New York City Police Department’s use of facial recognition technology in investigations; the Department of Investigation’s use of facial recognition in internal investigations; and the Office of Chief Medical Examiner’s use of biometric data in its DNA database.
Foard said that even the reporting required by Local Law 35 likely doesn’t accurately capture all the ways New York City agencies use biometric data, but that some uses would require disclosure under another city transparency law.
“I do want to indicate that agencies could be using biometric data in ways that aren’t involved in algorithmic decision making or AI or other uses, in which case we would not have visibility into that collection,” Foard said. “That said, the collection of identifying information and biometric data would be covered under the Identifying Information Law. Biometric data are considered identifying information, so any use that agencies are using to collect or maintain those data would be governed by that law.”
Foard also said each agency decides for itself whether its use of a technology meets the reporting criteria under the local law, adding that “there could be instances where we talk to an agency and they make a determination [that] it doesn’t seem to quite fit.” He also said that OTI cannot force agencies to report or change the tools they use, but that the agency instead coordinates the reporting efforts and works with the agencies on an advisory basis, providing them with both written guidance and one‑on‑one guidance as needed around using emerging technologies like AI.
But councilmembers were sharply critical of OTI for its lack of information, enforcement power and strategy regarding oversight of city agencies’ use of these tools, as well as the agency’s overly “neutral,” advisory posture, given the civil rights implications of government use of these technologies. (OTI declined to comment further for this story.)
Chair De La Rosa did concede some leniency due to the agency’s recent leadership change — Mayor Zohran Mamdani last month named Lisa Gelobter, an Obama-era Education Department appointee, to serve as his administration’s chief technology officer and leader of OTI — but she said it did not excuse what she considered a lack of clear answers about citywide technology policies on biometric data and AI.
“Listen, I think that I understand that you all have just been appointed a new chief technology officer,” De La Rosa said. “Today’s her first day, and I appreciate y’all coming and being here and testifying, but I want to set the expectation that this committee is going to ask you all about citywide positions on things, because OTI has a directive to sort of be the clearing house for how technology is used across the city, and what we have seen here today is the inability for OTI to answer very basic questions that are policy positions that an agency should have clear.”
During her comments on Monday, Councilmember Shahana Hanif argued that OTI’s neutral posture around AI is “dangerous” — particularly on facial recognition software, which studies have shown produces high false-positive rates for people from marginalized groups, potentially reinforcing racial biases.
“Knowing that technology performs unequally across race and gender, should the city view that as discriminatory?” she said. “What I’m trying to get at is that it seems as though … the city right now does not have a good grasp of biometric technology, and I think taking a neutral position is quite dangerous for our city, particularly because the field of biometrics is only growing, and I take a lot of concern knowing that the NYPD is using this data.”
Following testimonies from OTI’s Foard and Lucy Joffe, the Department of Housing Preservation and Development’s deputy commissioner for policy and strategy — who was straightforward about HPD’s lack of expertise on biometrics, lack of enforcement ability and limited visibility into private landlords’ biometric use — De La Rosa said the council’s technology committee would send a letter after the hearing asking OTI for specific datasets and information on city agency use of these technologies. She added that “if there are things you can give us, give them to us, and if there aren’t things that we can give them, then we could have that conversation on the side.”