Report: U.S. Police Are Abusing Facial Recognition Technology

David Grossman
Photo credit: Public records obtained by ACLU Oregon & Northern California.

From Popular Mechanics

Facial recognition software is increasingly common in police departments around the globe. But a scathing new report from Georgetown Law’s Center on Privacy and Technology (CPT) details how law enforcement agencies across the country use, and consistently abuse, the technology.

The CPT's report, titled "Garbage In, Garbage Out," describes how police departments are feeding facial recognition software flawed data. When searching for a suspect, police will feed an algorithm composite sketches, or photos of celebrities they believe share physical features with the suspect.

"On multiple occasions, when blurry or flawed photos of suspects have failed to turn up good leads, analysts have instead picked a celebrity they thought looked like the suspect, then run the celebrity’s photo through their automated face recognition system looking for a lead," reads a report summary.

After noticing one suspect's resemblance to the actor Woody Harrelson, for example, a New York City police officer fed the system pictures of Harrelson that he had found on Google. Using the matches returned for Harrelson, investigators were later able to make an arrest. In other cases, departments have submitted composite sketches to their systems instead of photographs.

"The stakes are too high in criminal investigations to rely on unreliable-or wrong-inputs," warns the report. "It is one thing for a company to build a face recognition system designed to help individuals find their celebrity doppelgänger or painting lookalike for entertainment purposes. It's quite another to use these techniques to identify criminal suspects, who may be deprived of their liberty and ultimately prosecuted based on the match. Unfortunately, police departments' reliance on questionable probe photos appears all too common."

Although not universal, these tactics are widespread. State and local police departments in Maryland, Virginia, Florida, Oregon and Arizona all confirmed to the CPT that sketches could be submitted to their face recognition systems. In Arizona, the Maricopa County Sheriff’s Office gives officers a brochure, which states that "suspect sketches and even forensic busts" can be used with facial recognition software.

The ability of these systems to accept composite sketches is a selling point for market leaders like Amazon, Germany-based Cognitec, and California-based Vigilant Solutions.

"Vigilant’s tools help enable agencies to edit the images for better-quality probe images, including creating a proxy image from a sketch artist or artist rendering," reads the company's website.

Photo credit: Klare, Li, & Jain

However, the report points to multiple studies in recent years, from the Los Angeles Police Department to the National Institute of Standards and Technology (NIST), arguing that composite sketches don't work. "Sketch searches mostly fail," reads the NIST report.

The NYPD also edits photos to make them easier for the algorithms to process. The department uses Photoshop techniques that would be familiar to even casual users of the software, such as applying a blur effect to an overexposed or low-quality image, or using the clone stamp tool to patch up poor images. At times, the NYPD and other organizations will use "3D modeling software to complete partial faces and to 'normalize' or rotate faces that are turned away from the camera."
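For a sense of what that kind of preprocessing looks like, here is a minimal illustrative sketch in Python using the Pillow imaging library. The file names are hypothetical, and this approximates only the blur step described above; it is not the NYPD's actual tooling or workflow.

    # Illustrative sketch only: smoothing a noisy, low-quality probe photo
    # before it would be submitted to a matching system. File names are
    # hypothetical; this is not any department's actual software or process.
    from PIL import Image, ImageFilter

    probe = Image.open("probe_photo.jpg")  # hypothetical low-quality probe image
    # Apply a mild Gaussian blur to soften noise and pixelation
    smoothed = probe.filter(ImageFilter.GaussianBlur(radius=2))
    smoothed.save("probe_photo_smoothed.jpg")  # image that would be fed to the matcher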

The NYPD, for its part, does not deny using these tools.

“The NYPD constantly reassesses our existing procedures and in line with that are in the process of reviewing our existent facial recognition protocols,” Detective Denise Moroney said in a statement to The Verge. “No one has ever been arrested on the basis of a facial recognition match alone. As with any lead, further investigation is always needed to develop probable cause to arrest. The NYPD has been deliberate and responsible in its use of facial recognition technology."

There's no way to determine how common these tactics are, the CPT report states. Data on how law enforcement uses facial recognition software is spotty, and police do not have to disclose how their software reached its conclusions.

"Even though prosecutors are required under federal law to disclose any evidence that may exonerate the accused, defense attorneys are not typically provided with information about 'virtual probes,' celebrity doppelgängers, or really any information about the role face recognition played in identifying their client," the report states.

However, the CPT states that in all likelihood, "the problem will get a lot bigger." The FBI, whose facial recognition program has been noted for its flaws, is committed to expanding that program over the next two to three years.

"In setting this goal, the FBI has assumed that the results of face recognition systems will become more accurate as the algorithms improve," the report states. "But these improvements won’t matter much if there are no standards governing what police departments can feed into these systems."

Source: The Verge
