The state attorney general says a program used by the Department of Motor Vehicles for fraud detection is illegal, but the national market presses onward.
Colin Wood is the managing editor of StateScoop.
As governments search for new means of identity management for citizens, a decision in Vermont has created a setback for facial recognition software.
Attorney General TJ Donovan ruled Tuesday that a program operated by the state’s Department of Motor Vehicles is illegal. Donovan’s office said the program, which has been used to scan and collect thousands of driver’s license images for the purpose of fraud detection since the program’s launch in 2012, does not comply with a 2004 law called Act 154.
The decision was praised by civil rights advocates, but it came during a week in which two companies announced a partnership to bring improved facial recognition capabilities to jurisdictions still permitted to use the technology amid emerging legal and ethical standards.
Act 154 states that the department "shall not implement any procedures or processes for identifying applicants … that involve the use of biometric identifiers." The program had been halted in May pending a decision from the attorney general's office.
The American Civil Liberties Union generally opposes government programs that use biometric technologies like facial recognition and fingerprinting. The group called the Vermont DMV's program “patently illegal” and suggested it was secretly being used for purposes other than fraud detection.
The executive director of ACLU of Vermont, James Lyall, said in a news release that the program “invades Vermonters’ privacy, disproportionately targets people of color, places immigrants at increased risk of harm, and lacks due process protections to prevent further abuse.”
Vermont Motor Vehicles Commissioner Robert Ide told the media he had been in contact with the AG’s office, was not surprised by the decision and would pursue other methods of fraud detection.
As government leaders express interest in the new capabilities and automation promised by artificial intelligence technologies like facial recognition, civil rights advocates have warned for years of negative consequences, even the realization of a “police state,” if government-led biometric programs are left unchecked.
A 2016 report from Georgetown Law’s Center for Privacy and Technology found that nearly half of all Americans are included in at least one police facial recognition database. The report reveals a mixed, but largely unregulated, legal landscape in which 16 states explicitly allow the use of facial recognition and more than 25 percent of state and local police departments nationally have the opportunity to run facial recognition searches through their own or another agency’s system.
The Vermont ruling comes as Motorola Solutions and artificial intelligence company Neurala announced a partnership aimed at giving police equipped with body cameras new capabilities, such as automated searching for missing children and suspects, according to a company press release. The new capabilities are intended for integration with the company's Si500 camera.
Motorola Solutions Chief Technology Officer Paul Steinberg explained that the technology could potentially create safer communities.
Neurala's capabilities, which include device-level analysis of images and machine learning, "will help us explore solutions for a variety of public safety workflows such as finding a missing child or investigating an object of interest, such as a bicycle,” Steinberg said.
In an email to StateScoop, a spokesperson explained that the technology to be enabled by the new partnership is not facial recognition, but an object recognition capability that would allow law enforcement to easily identify a person with brown hair or a red shirt, for example.
It's not clear whether facial recognition will become a common feature included in body cameras. Axon, formerly known as Taser, is the predominant force in the police body camera market and does not yet include the technology. In April, CEO Rick Smith told Quartz it's a complex issue, citing a need to balance the utility of the technology with transparency and the ability to audit its use.
In February, Axon acquired two machine learning companies, Dextro and Misfit. The company's website explains that the technology will be used for "eliminating paperwork and automating tedious back-office workflows such as redaction." The camera maker also notes its responsibility in using AI "for the public good" as it forms an AI Ethics Board to craft ethical guidelines that keep in mind issues of privacy and public trust.
While humanity’s biological evolution has left the brain adept at immediately discerning the subtle differences among countless faces, computer software still struggles with varying lighting and camera angles, and users frequently report false positives generated from the two-dimensional images used in their databases. The advent of facial recognition and the continued proliferation of public surveillance cameras have led to several efforts to thwart the emerging technology, including a camouflage technique called CV Dazzle that involves wearing dramatic and colorful makeup and hairstyles to change one’s usual appearance and slip into the crowd undetected.
This story was updated on July 20, 2017 to clarify that the company involved in the new partnership is Motorola Solutions, not Motorola. More information was also added to clarify that the type of technology enabled by the partnership is object recognition, not facial recognition.