Garbage In, Garbage Out: Face Recognition on Flawed Data
What if a misused algorithm could falsely incriminate you?
This may already be a reality for people in New York City. On multiple occasions, when blurry or otherwise flawed photos of suspects failed to turn up good leads, analysts instead picked a celebrity they thought resembled the suspect and ran the celebrity's photo through the department's automated face recognition system in search of a match.

In a report released by the Center on Privacy & Technology, we explain how police agencies across the country misuse face recognition technology, drawing on actual cases from the NYPD. Analysts using these systems also sometimes submit forensic sketches, a practice that has been shown to fail the vast majority of the time, and routinely doctor low-quality photos to make them appear clearer, including by copying and pasting facial features from another person's face onto the photo of an unknown suspect.