A recent study published by Clare Garvie of Georgetown Law's Center on Privacy and Technology reveals that police and law enforcement agencies have been feeding celebrity images, composite and digitally rendered sketches, and pixelated closed-circuit camera images into their facial recognition systems to solve crimes. The study came out just days after San Francisco voted to ban police and city agencies from using facial recognition systems.
The study points out that there are few to no rules governing how police and other law enforcement agencies can use facial recognition software, especially when it comes to what images they can submit to a system to generate investigative leads. As a result, they can submit all manner of "probe photos" of unknown individuals to be searched against mugshot or driver's license databases. Consequently, police can – and do – submit low-quality camera stills, screenshots from social media, filtered selfies, and scanned photo-album pictures.
As an example, the researcher pointed to the case of a man accused of stealing from a CVS in New York City. The store's surveillance system captured only an obstructed, pixelated photo of the suspect, and the facial recognition system used by the NYPD was unable to generate a match. Getting creative, detectives noticed that the suspect "kinda" looked like the actor Woody Harrelson and fed the actor's image into the system instead. This celebrity "match" was sent back to the investigating officers, and someone who was not Woody Harrelson was eventually arrested for petit larceny.
According to the study, Woody Harrelson was not the only celebrity whose photo was used to identify a criminal doppelganger. The NYPD's Facial Identification Section (FIS) also used a photo of a New York Knicks player to search its facial recognition database for a man wanted on an assault charge in Brooklyn.
“The stakes are too high in criminal investigations to rely on unreliable—or wrong—inputs. It is one thing for a company to build a face recognition system designed to help individuals find their celebrity doppelgänger or painting lookalike for entertainment purposes. It’s quite another to use these techniques to identify criminal suspects, who may be deprived of their liberty and ultimately prosecuted based on the match. Unfortunately, police departments’ reliance on questionable probe photos appears all too common,” Garvie wrote in her study.
Meanwhile, Garvie also notes that police are not just using photos of other people – celebrities, no less – to stand in for suspects. One of the most questionable practices, the researcher found, is the use of composite sketches as probe images to generate matches in the database.
“At least half a dozen police departments across the country permit, if not encourage, the use of face recognition searches on forensic sketches,” she highlights.
In fact, the brochure from Maricopa County Sheriff’s Office in Arizona states that “the image can be from a variety of sources including police artist renderings,” and that the technology “can be used effectively in suspect identifications using photographs, surveillance still and video, suspect sketches and even forensic busts.”
The privacy lawyer argues that this is a dangerous practice and that misidentification is highly probable. "The most likely outcome of using a forensic sketch as a probe photo is that the system fails to find a match—even when the suspect is in the photo database available to law enforcement. With this outcome, the system produces no useful leads, and investigating officers must go back to the drawing board," Garvie argues.
While law enforcement agencies argue that facial recognition is limited to initial identification and is never treated as a positive ID, Garvie documented cases where innocent people were implicated solely on the basis of a facial recognition match.
“NYPD officers made an arrest after texting a witness a single face recognition “possible match” photograph with accompanying text: “Is this the guy…?” The witness’ affirmative response to viewing the single photo and accompanying text, with no live lineup or photo array ever conducted, was the only confirmation of the possible match prior to officers making an arrest,” Garvie wrote in her study.
In the end, Garvie said that even if the FBI is confident that facial recognition accuracy will keep getting better as algorithms improve, those improvements won't matter as long as there is no regulation governing how the systems are used.
“In the absence of those rules, we believe that a moratorium on local, state, and federal law enforcement use of face recognition is appropriate and necessary,” the study concludes.