As facial recognition databases become a more common tool for law enforcement, a Florida court could issue a landmark ruling on the use of the technology.
A case now before Florida's First District Court of Appeal is the first of its kind, according to researchers at the Georgetown Center on Privacy and Technology, and could set a precedent for how law enforcement is permitted to use facial recognition in investigations going forward.
The case centers on the Jacksonville Sheriff’s Office’s alleged use of a facial recognition system to identify Willie Allen Lynch, a man convicted in May 2016 of selling $50 worth of crack cocaine. Lynch’s attorneys are demanding that law enforcement turn over photos of the other matches the system returned, which could potentially exonerate Lynch under the Brady disclosure rule established by the Supreme Court in 1963.
Appellate judges will decide whether the state is required to turn over the material and whether the identification process met legal standards for conviction.
State prosecutors are defending law enforcement’s use of the technology and claim that the facial recognition system was not a factor in Lynch’s arrest and subsequent conviction.
Celbrica Tenah, an analyst with the Sheriff’s Office Crime Analysis Unit, explained in her deposition how the technology works, but had trouble elaborating when pressed for details.
“It does arrange the photos based on likeliness,” Tenah said. “I can’t speak to the algorithms … but it does from my understanding arrange the photos based on what’s most likely to the photo that you uploaded.”
Some facial recognition systems used by law enforcement, like the one used by the Jacksonville Sheriff’s Office, compare a photo uploaded to a mugshot database of known identities supplemented by the state’s driver’s license data.
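The ranking Tenah describes can be sketched in generic terms: such systems typically encode each face photo as a numeric feature vector and sort the gallery by similarity to the uploaded probe photo. The following is a minimal illustration of that idea, not the Jacksonville system's actual method; the embeddings, names, and similarity measure here are all invented for the example.

```python
# Illustrative sketch of how a facial recognition gallery search might rank
# candidates. Real systems derive embeddings from a trained neural network;
# here the "embeddings" are made-up vectors and all identifiers are hypothetical.
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_gallery(probe, gallery):
    """Return gallery IDs sorted from most to least similar to the probe."""
    scores = {pid: cosine_similarity(probe, emb) for pid, emb in gallery.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical probe-photo embedding and mugshot gallery
probe = np.array([0.9, 0.1, 0.3])
gallery = {
    "mugshot_A": np.array([0.8, 0.2, 0.3]),  # close match
    "mugshot_B": np.array([0.1, 0.9, 0.5]),  # poor match
    "mugshot_C": np.array([0.7, 0.1, 0.4]),  # close match
}

print(rank_gallery(probe, gallery))  # most similar candidates listed first
```

Note that a system like this returns a ranked list of candidates rather than a single definitive identification, which is precisely why the other, lower-ranked matches are at issue in the case.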
Unregulated police access to such systems has drawn concern from civil rights advocates, some of whom liken the use of a facial recognition database built on photos of innocent people, collected without their explicit consent, to a virtual police lineup.
Repeated studies and courtroom anecdotes have also shown that people have a harder time identifying faces of ethnicities other than their own. Because the engineers who develop facial recognition software are mostly non-black, critics argue that their implicit biases have seeped into the algorithms, which now disproportionately misidentify non-white criminal suspects.
If the Florida ruling comes down on the side of the defendant, it could set a precedent for future use of facial recognition by law enforcement and encourage new laws that establish oversight bodies or require policies for the ethical use of the technology, much as has happened in recent history with other emerging technologies such as autonomous vehicles.
Defense attorneys countered the prosecution’s response to their appeal in November, and the case is now awaiting a decision that is expected to take several more months.