The first round of disclosures about the use of surveillance technology by the New York Police Department under a city law adopted last year left many of the privacy-rights and civil-liberties advocates who pushed for the law’s passage unsatisfied.
In a letter to New York Police Commissioner Dermot Shea this week, the New York Civil Liberties Union wrote that the disclosures the department filed last month, which covered about 36 technologies, including gunshot detection platforms, body-worn cameras, closed-circuit television and facial recognition, lacked “serious consideration of the potential for biased and disparate enforcement.”
Among other criticisms, the organization accused the NYPD of taking a “lazy copy-and-paste approach” that repeated language across the 36 draft policies it released, sometimes without even bothering to describe the correct system. On the disclosure for its use of unmanned aerial vehicles, the NYPD “didn’t even replace language related to body-worn cameras,” Daniel Schwarz, an NYCLU privacy and technology strategist, told StateScoop.
“The NYPD shows a complete lack of serious engagement on the bias of these technologies,” he said.
‘No way of knowing’
The disclosures were ordered under the Public Oversight of Surveillance Technology, or POST, Act, which the New York City Council approved and Mayor Bill de Blasio signed last summer. The law requires the NYPD to provide the public with a list of all “equipment, software, or systems capable of, or used or designed for, collecting, retaining, processing, or sharing audio, video, location, thermal, biometric, or similar information,” along with explanations of how those tools are used.
Schwarz said the police documents are also incomplete because they often do not identify the exact vendors the department is using. He said that’s particularly troubling with respect to facial recognition, a technology that privacy advocates have long charged with being error-prone, especially when attempting to identify women and Black and Latino people. In its disclosure, the NYPD stated that its facial-recognition tools are closely monitored by human operators.
“The safeguards and audit protocols built into this impact and use policy for facial recognition technology mitigate the risk of impartial and biased law enforcement,” the department’s disclosure reads. “NYPD facial recognition policy integrates human investigators in all phases. All possible facial recognition matches undergo a peer review by other facial recognition investigators.”
But Schwarz said that without knowing what facial-recognition software New York City is using, the NYCLU and others can’t verify that.
“Facial recognition has been shown in study after study to have disparate impacts,” he said. “Without knowing the vendor that’s used, there’s no way of knowing how the NYPD’s system would work.”
The NYCLU also took exception to the department’s filing about the Domain Awareness System, a citywide surveillance network New York developed in partnership with Microsoft in the years after 9/11. The system is connected to numerous devices throughout the city, including license plate readers and 18,000 closed-circuit cameras, along with millions of police, court and 911 records. The NYPD’s disclosure for the system claims that it does not use any machine learning tools, yet the NYCLU’s letter to Shea points to past news reports that the department is making use of those products, such as a piece of crime analytics software known as Patternizr.
“It may be they changed the capability, but [the discrepancy] shows the [refusal] to engage with the POST Act,” Schwarz said.
A deeper look
The POST Act was first proposed in 2017, but gained enough steam last year amid a nationwide reconsideration of policing following the killing of Minneapolis resident George Floyd by an officer there. It also followed other actions by New York City to evaluate and possibly rein in the use of surveillance technology, including an 18-month task force on the use of algorithms in executing government policies that led to the creation of a new role for an algorithms management and policy officer in de Blasio’s office. (The hiring for that position, though, was postponed after the COVID-19 pandemic hit.)
But Schwarz said the city, in particular the police department, has been slow to change.
“The time is right for looking more deeply into these systems,” he said. “With the NYPD little has changed. We will have to see what the next steps are. Banning technologies is one step,” he said, noting that several major cities have outlawed facial recognition, “but there may be other ways to strengthen the POST Act.”
Following a public comment period, the NYPD has until April 11 to issue finalized policies.