Clare Garvie (L’15) of the Center on Privacy & Technology Continues Groundbreaking Work on Police Facial Recognition

May 16, 2019

Clare Garvie (L’15), senior associate at the Center on Privacy & Technology, continues her groundbreaking work on police facial recognition practices with two new reports issued this week.

On April 28, 2017, a suspect was caught on camera reportedly stealing beer from a CVS in New York City. The store surveillance camera that recorded the incident captured the suspect’s face, but it was partially obscured and highly pixelated. When the investigating detectives submitted the photo to the New York Police Department’s (NYPD) facial recognition system, it returned no useful matches.

Rather than concluding that the suspect could not be identified using face recognition, however, the detectives got creative.

One detective from the Facial Identification Section (FIS), responsible for conducting face recognition searches for the NYPD, noted that the suspect [coincidentally resembled] the actor Woody Harrelson, known for his performances in Cheers, Natural Born Killers, True Detective, and other television shows and movies. A Google image search for the actor predictably returned high-quality images, which detectives then submitted to the face recognition algorithm in place of the suspect’s photo. In the resulting list of possible candidates, the detectives identified someone they believed was a match — not to Harrelson but to the suspect whose photo had produced no possible hits.

This… “match” was sent back to the investigating officers, and [the suspect] was eventually arrested for petit larceny.

This astonishing story of an innocent celebrity’s image being used in a police investigation leads off “Garbage In, Garbage Out: Face Recognition on Flawed Data,” a report released on May 16 by Georgetown Law’s Center on Privacy & Technology and written by Garvie. A second new report, “America Under Watch: Face Surveillance in the United States,” co-authored by Garvie and Professor Laura Moy, the Center’s executive director, focuses on Detroit, Chicago, and other cities. Both reports were cited in a New York Times opinion piece on San Francisco’s decision to ban the use of facial recognition technology.

On May 22, Garvie is scheduled to testify on the subject of police facial recognition before the House Oversight Committee.

“I used to think that we could control face recognition through comprehensive legislation,” she says in a Center on Privacy & Technology news release. “That may eventually be possible, but thanks to the new information we have learned, I now believe we need to hit the pause button on its further use and advancement. San Francisco’s ban on face recognition shows us that community leaders are waking up to the danger of this technology… I will advocate for a nationwide moratorium on law enforcement use of face recognition until comprehensive rules are put in place.”

Expertise

As a 3L at Georgetown Law in 2014-2015, Garvie served as a research assistant for Professor Laura Donohue, who recommended that Garvie apply to be the first law fellow at the Center, then in its first year. Less than 18 months after her 2015 graduation, Garvie’s name was at the top of “The Perpetual Line-Up: Unregulated Police Face Recognition in America,” a report that represented more than a year of research.

In 2019, Garvie has clearly established her expertise in the subject, with new reports on police facial recognition practices highlighting areas of concern.

“There are no rules when it comes to what images police can submit to face recognition algorithms to generate investigative leads,” the “Garbage In, Garbage Out” report states, adding that agencies can and do submit all manner of “probe photos” of unknown and unrelated individuals to get the system to produce a match. “These images may be low-quality surveillance camera stills, social media photos with filters, and scanned photo album pictures. Records from police departments show they may also include computer-generated facial features, or composite or artist sketches.”

The FIS has also used a photo of a New York Knicks player to search its face recognition database for a man wanted for assault.

The report argues that the stakes are too high in criminal investigations to rely on fundamentally unreliable inputs: “It is one thing for a company to build a face recognition system designed to help individuals find their celebrity doppelgänger or painting lookalike for entertainment purposes. It’s quite another to use these techniques to identify criminal suspects, who may be deprived of their liberty and ultimately prosecuted based on the match. Unfortunately, police departments’ reliance on questionable probe photos appears all too common.”

The New York-focused report makes a number of recommendations, including halting the practice of submitting celebrity photos as probe images.

“As the technology behind these face recognition systems continues to improve, it is natural to assume that the investigative leads become more accurate,” the report concludes. “Yet without rules governing what can — and cannot — be submitted as a probe photo, this is far from a guarantee. Garbage in will still lead to garbage out.”