As facial recognition takes off with police, laws governing its use remain conspicuously absent

A new system from Amazon is a hit with the Washington County Sheriff's Office, but civil rights advocates say stronger governance is overdue.

With growing adoption of facial recognition systems, the phrase “Have you seen this man?” may soon disappear from law enforcement’s vernacular. It’s a shift that comes with immense potential for police forces — and also many unresolved concerns for the public.

Not only is the technology getting better, it’s getting cheaper, making it accessible to departments that might not have the nation’s biggest IT budgets. The Washington County Sheriff’s Office in Oregon is among those finding that the return on a small investment can be considerable: Since the county near Portland launched its system this summer, it has become a regular tool for officers and has led to the detainment of suspects who would likely have otherwise remained at large, an analyst for the office told StateScoop.

In Washington County, which serves a population of about 530,000 people, officers use the system about five times a day, said Chris Adzima, a senior systems analyst with the department. Before building out an early version of the system in January, the department had to resort to methods that date back decades — asking around town or asking nearby law enforcement agencies if a certain visage captured by a distant camera looks familiar. That’s changed, Adzima said.

“Now, we have a full system that we can run against all 300,000 different people who have been through the jail in the past 10 years,” Adzima said.

The growth in usage is coming at a social cost that is difficult to measure. Though law enforcement agencies like the Washington County Sheriff’s Office typically characterize facial recognition software as an acceptable continuation of traditional practices, the speed and the power of the technology raise questions about fairness, accuracy, potential privacy violations and the need for general oversight by the public.

These concerns persist in the national debate; even a landmark 2016 study has not managed to spur the federal policy and oversight updates that civil rights advocates have been seeking for years.

But as more law enforcement agencies embrace the technology and watchdogs sort out its implications, at least one thing is clear: Police are using the technology because it works.

From unknown to confirmed

The Washington County office uses Amazon’s Rekognition, an image and video analysis engine, which draws from a repository of mugshots uploaded to Amazon’s cloud. When an officer feeds in an image of a suspect, the system instantly returns a result indicating whether the person has been booked in the county jail within the past decade.
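The county has not published its configuration, but a lookup of that kind corresponds to Rekognition’s standard face-search call against a previously indexed collection. A minimal sketch, with an illustrative collection name, file name and similarity threshold, might look like this:

```python
import boto3

# Minimal sketch (not the county's actual code) of a face search against a
# previously indexed Rekognition collection. The collection name, file name,
# and similarity threshold below are illustrative assumptions.
rekognition = boto3.client("rekognition", region_name="us-west-2")

with open("suspect_still.jpg", "rb") as image_file:
    response = rekognition.search_faces_by_image(
        CollectionId="mugshot-collection",   # assumed name of the indexed mugshot set
        Image={"Bytes": image_file.read()},  # still frame pulled from surveillance video
        FaceMatchThreshold=80,               # only return candidates above this similarity
        MaxFaces=5,                          # cap the number of candidate matches
    )

for match in response["FaceMatches"]:
    face = match["Face"]
    print(f"Candidate {face.get('ExternalImageId', face['FaceId'])} "
          f"similarity={match['Similarity']:.1f}%")
```

Even a high-similarity hit from a call like this is only a candidate; as the office and the FBI both stress later in this story, results are treated as investigative leads to be verified, not as positive identifications.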

Adzima said his favorite example of the technology was the first lead it generated earlier this year. The story goes that a man entered a hardware store, filled his cart with thousands of dollars’ worth of tools, pantomimed checking out in the self-checkout line, and then walked out of the store without paying.

“Unbeknownst to him, there’s a camera in each self-checkout kiosk, so we got a really good picture of his face,” Adzima said. “We ran that through Rekognition, got a result and the result we got back looked very similar to him.”

Adzima said he forwarded the lead to a detective, who then continued with his usual research, which included a quick Facebook search.

“His public profile picture was wearing the exact same hoodie as the person who stole the items from the hardware store,” Adzima said. “So it went from somebody we didn’t know and had no idea who he was to a pretty much confirmed ID within a couple of minutes. And they were able to go out and find that person and were able to detain him with some of the property he stole.”

Facial recognition and advances in camera technology are a boon for law enforcement. In years past, a fuzzy image from a distant surveillance camera would likely not have provided enough information for police to identify the suspect — unless he happened to be a regular offender in the area, Adzima said.

Case resolutions like this one will become more common if police adoption of facial recognition continues to grow. A 2016 study published by the Center on Privacy and Technology at Georgetown Law shows that half of American adults — more than 117 million people — are in a law enforcement agency’s facial recognition database.

While technology companies that provide facial recognition systems typically claim accuracy rates of 95 percent or greater, law enforcement offices generally verify positive hits with human eyes. However, some research shows that human users are unable to make accurate assessments without specialized training, which civil rights advocates say can lead to unfair profiling.

The technology’s low cost and relative ease of implementation could also drive increased adoption in the coming years.

Adzima, who had not worked with Amazon Web Services before, said he was able to lay the groundwork for the system in three days, then spent the next month uploading mugshots with metadata and ensuring they were properly indexed in the system. He said he was able to run searches right away, and the following months were spent refining the system and offering it to the department at large so officers wouldn’t have to go through him to conduct searches. Similar facial recognition systems for police are offered by companies like Facefirst, Vigilant Solutions and NEC.

And it’s so inexpensive, Adzima said, that he often has to reassure whoever he’s talking to that there aren’t any hidden costs. Initial setup, which included uploading about 310,000 mugshots to Amazon’s S3 cloud, cost about $700, he said, plus a $12 monthly fee.
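That one-time load, copying booking photos into S3 and registering each one in a Rekognition face collection, maps onto two standard API calls. A minimal sketch, with made-up bucket, collection and booking identifiers, might look like this:

```python
import boto3

# Sketch of the one-time indexing pass: create a face collection, then register
# each mugshot already stored in S3 so it can be searched later.
# Bucket name, collection ID, and booking IDs are illustrative assumptions.
rekognition = boto3.client("rekognition", region_name="us-west-2")

COLLECTION_ID = "mugshot-collection"
BUCKET = "county-mugshots"   # hypothetical S3 bucket holding the booking photos

rekognition.create_collection(CollectionId=COLLECTION_ID)  # run once

def index_mugshot(object_key: str, booking_id: str) -> None:
    """Register a single S3-hosted mugshot, tagged with its booking number."""
    rekognition.index_faces(
        CollectionId=COLLECTION_ID,
        Image={"S3Object": {"Bucket": BUCKET, "Name": object_key}},
        ExternalImageId=booking_id,  # lets a later match point back to the booking record
        MaxFaces=1,                  # each mugshot contains a single subject
        QualityFilter="AUTO",        # skip images Rekognition judges too poor to index
    )

index_mugshot("bookings/2017/000123.jpg", "booking-000123")
```

Storage and indexing at this scale are billed per image and per gigabyte, which is consistent with the modest setup cost Adzima describes.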

“It’s definitely changed the thought process around utilizing cloud services in the Sheriff’s Office,” Adzima said. “Prior to this we didn’t have any presence with AWS and now we’re thinking about ways to utilize other cloud services in general to help get tools to the deputies and detectives.”

Privacy on the back end

Facial recognition has a lot of law enforcement offices thinking about how to do police work differently. The Arizona Department of Public Safety is ramping up its use of facial recognition through connection with a database of driver’s license photos — a more controversial use of the technology. Several large metropolitan police departments around the country — including the NYPD and LAPD — have also been using facial recognition systems, in some cases for years before the public even learned about it. In the United Kingdom, where police operate real-time facial recognition systems, authorities hold more than 20 million images of members of the public, many of whom have never been charged with a crime.

The FBI’s Criminal Justice Information Services division has maintained its own system called the Next Generation Identification Interstate Photo System since 2011. The program’s Facial Analysis, Comparison, and Evaluation (FACE) system contains tens of millions of photos that can be accessed by law enforcement offices in 11 states. Precisely what subset of photos is contained in this database is unclear.

In 2016, the Government Accountability Office reported that the FBI was negotiating with 18 additional states and Washington, D.C., to access their driver’s license photos, but the report was later republished with all references to the 18 states removed and a statement that there were “no negotiations underway.”

As in Washington County, the FBI says any result from its system is “only used as an investigative lead, and not as a means of positive identification.”

The FBI’s increased use of the technology prompted a full House oversight committee hearing in March in which Republican Rep. John Duncan said, “I think we’re reaching a very sad point, a very dangerous point, when we’re doing away with the reasonable expectation of privacy about anything.”

Georgetown University researcher Clare Garvie says facial recognition systems provide an “imperfect biometric” and are laden with unresolved social problems.

How the data in these systems is collected and used is frequently undisclosed and subject to no standardized independent oversight. The Georgetown Law research found that only nine of 52 agencies studied reported that they log and audit searches for improper use.

Some civil rights advocates equate the use of facial recognition systems to that of a digital police line-up, except that the innocent faces in the line-up didn’t agree to participate. Others, like U.S. Rep. Elijah Cummings of Maryland — the top Democrat on the House Oversight Committee — say the technology disproportionately affects ethnic minorities.

“If you’re black, you’re more likely to be subjected to this technology and the technology is more likely to be wrong,” Cummings said at a congressional hearing in March. “That’s a hell of a combination.”

Research from the FBI supports the idea that some demographic groups “are more susceptible to errors in the face matching process.” MIT Media Lab researcher Joy Buolamwini says this could be in part because the computer scientists writing facial recognition software are disproportionately non-black. A 2011 study by the National Institute of Standards and Technology provides evidence to support the claim that “humans recognize faces of their own race more accurately than faces of other races.” Some say that phenomenon is unintentionally transferred to software that is designed chiefly by non-blacks.

Recognizing that the technology is imperfect, Adzima said he doesn’t see anything wrong with how police are using it in Washington County.

“All of the data is public record,” Adzima said — the software is simply a tool to reduce the time spent manually processing images that both officers and the public can already access, he said.

Adzima added that it “doesn’t cause harm” if a bad lead is generated. Matches from the system are investigated with due diligence in the same way as other leads, he said — if someone calls the department claiming that a person had committed a crime, officers would simply investigate the claim, and the leads generated by this software are treated similarly.

While the Georgetown study prompted changes in some places, like Vermont, where the state attorney general put a moratorium on the technology while investigating how it was being used, federal guidelines have not materialized. In fact, said Neema Singh Guliani, legislative counsel for the American Civil Liberties Union, use by agencies like the Department of Homeland Security (DHS) and the General Services Administration is on the rise.

In November, DHS called on industry to submit proposals for technology that would maintain a facial recognition database of everyone in a vehicle who passes through a U.S. border crossing. While the agency’s procurement documents call for technology that anonymizes the identities of travelers who are “not in-scope,” groups like the ACLU are not satisfied with the government’s implement-first, ask-questions-later approach.

“We don’t have public guidelines in many places. We don’t have legislation dictating when these searches can be run,” Guliani said. “You essentially have a situation where police have rolled out a new technology and they’ve done so not considering the privacy on the front end. That’s really a backward way of doing things.”
