Police have been feeding celebrity photos, doctored images into facial-recognition systems

New research shows that a lack of clear standards for using the technology has led many agencies to resort to an array of questionable practices in attempting to identify criminal suspects.

Law enforcement agencies around the country have been feeding artist sketches, distorted images and photos of celebrities into their facial-recognition systems to generate matches while searching for criminal suspects, according to new research released Thursday.

Facial recognition tools, which the city of San Francisco banned police and other government agencies from using on Tuesday, have grown increasingly common in airports and law enforcement agencies in recent years, but they remain mostly unregulated, according to a report by the Georgetown Law Center on Privacy & Technology. The report argues that this shortage of rules has allowed law enforcement to abuse the technology.

The report also calls for a moratorium on police use of facial recognition technology until policymakers issue standards for its appropriate use.

In one instance, the New York Police Department used a photo of actor Woody Harrelson to generate a match on a suspect after a still image taken from surveillance camera footage yielded no results from the city’s facial-recognition system. An arrest was subsequently made. In another instance, the department fed a photo of an unidentified New York Knicks player into the system while searching for an assault suspect.


Other source images the report identifies include photos posted to social media with filters applied, scanned photos from albums, computer-generated images, artist sketches and composite images. The Center on Privacy & Technology says it has identified at least six law enforcement agencies that “permit, if not encourage” using forensic sketches as source material in facial recognition searches.

“The stakes are too high in criminal investigations to rely on unreliable—or wrong—inputs,” Georgetown researcher Clare Garvie writes in the report. “It is one thing for a company to build a face recognition system designed to help individuals find their celebrity doppelgänger or painting lookalike for entertainment purposes. It’s quite another to use these techniques to identify criminal suspects, who may be deprived of their liberty and ultimately prosecuted based on the match.”

The NYPD defended its use of facial recognition systems in a statement emailed to StateScoop, noting that the technology merely generates leads and does not constitute positive identification nor probable cause for arrest.

“No one has ever been arrested on the basis of a facial recognition match alone,” NYPD spokeswoman Jessica McRorie said. “As with any lead, further investigation is always needed to develop probable cause to arrest. The NYPD has been deliberate and responsible in its use of facial recognition technology.”

But the Center on Privacy & Technology says officers in most jurisdictions where these systems are used are not given clear guidelines about what kind of additional evidence is needed to corroborate a match. The NYPD placed one suspect in a criminal line-up purely on the basis of a facial-recognition match, the report notes. A Metropolitan Police Department officer in Washington, D.C., presented a “possible match” photograph from the department’s facial-recognition system to a witness to confirm identity in one case.


Such searches are far from rare: the NYPD made 2,878 arrests following facial-recognition searches in the first five-and-a-half years of using the technology, according to NYPD records.

Beyond a lack of clear or unified governance, researchers object in the report to questionable use of the technology as a starting point for an investigation.

In some cases, investigators pasted facial features clipped from stock images, such as lips or eyes, onto suspects’ photos. Other techniques include using 3-D modeling software to fix flaws in original images, mirroring a partial face when the entire face was not visible, using a blur effect on a low-quality image, or using the “clone stamp tool” to “create a left cheek and the entire chin area” in one case when part of a suspect’s face was obscured in the source image.

“These techniques amount to the fabrication of facial identity points: at best an attempt to create information that isn’t there in the first place and at worst introducing evidence that matches someone other than the person being searched for,” the report states.

The NYPD maintains the technology has been valuable, pointing to the recent arrest of a man who was throwing urine at train conductors and another who pushed a subway passenger onto the tracks.


“The leads generated have also led to arrests for homicides, rapes and robberies,” McRorie said. “The NYPD has also used facial recognition for non-criminal investigations, for example a woman hospitalized with Alzheimer’s was identified through an old arrest photo for driving without a license.”

The researchers do not doubt that the technology can work; rather, they argue it is often used in ways that ensure it won’t, which is why they are calling for a temporary halt on its use by police.

“In the absence of [standards governing what police departments can feed into these systems], we believe that a moratorium on local, state, and federal law enforcement use of face recognition is appropriate and necessary,” the report concludes.
