Latest Work

New Privacy Center Report Argues Face Recognition Technology Should Not Be Used for Criminal Investigations

The Privacy Center published "A Forensic Without the Science: Face Recognition in U.S. Criminal Investigations." Authored by Senior Associate Clare Garvie, the report examines how face recognition technology, as a biometric forensic investigative tool, may be particularly prone to errors, bias, and manipulation. Read the full press release.

“New Report on Face Recognition: A Forensic Without the Science” blog

On December 6, 2022, the Privacy Center released "A Forensic Without the Science: Face Recognition in U.S. Criminal Investigations." Senior Associate Clare Garvie, the report's primary author, summarized its highlights and offered further commentary on face recognition on our blog. Read the whole blog here.

Response to President Biden’s AI Executive Order


The Privacy Center signed on to comments relating to OMB's guidance following President Biden's AI Executive Order. The comments were written by Just Futures Law, Surveillance Resistance Lab, the UCLA Center on Race and Digital Justice, Mijente, and Media Justice, many of whom have been and continue to be partners in Privacy Center efforts. The comments focused on the OMB guidance specifically as it relates to DHS and federal acquisition and use of AI impacting immigrant communities.

A Forensic Without the Science Featured in National Association of Attorneys General Panel

Clare Garvie presented on an informational panel about face recognition alongside Patrick Grother and Gary Marchant at the 2022 National Association of Attorneys General Annual Meeting. The Center's newly released report, "A Forensic Without the Science: Face Recognition in U.S. Criminal Investigations," was circulated in advance as a CLE resource.

Comments Submitted to the FTC on Commercial Surveillance and Data Security

The Center submitted a letter in response to the Federal Trade Commission's Advance Notice of Proposed Rulemaking on Commercial Surveillance and Data Security, a key proceeding with the potential to establish long-needed protections from abusive business practices enabled by the mass collection of personal data. The Center's comments urged the Commission to keep three specific contexts of surveillance-fueled harm and injustice in mind in its rulemaking determinations: worker surveillance, immigrant surveillance, and systems of policing and punishment.

“Center on Privacy & Technology Staff Picks: Holiday Recipes” blog

In anticipation of the Thanksgiving holiday, Privacy Center staff shared their favorite holiday recipes on our blog. Read the whole blog here.

Washington Post: How Workers Can Protect Their Privacy

Senior Associate Cynthia Khoo was quoted in a Washington Post article about privacy considerations for workers' communications at work. She discussed factors workers should be aware of with respect to their digital security and communications privacy.

“Testifying for the DC Stop Discrimination by Algorithms Act” blog

On September 22, 2022, Senior Associate Cynthia Khoo testified on behalf of the Privacy Center to support the Stop Discrimination by Algorithms Act (Bill B24–558) (SDAA) in a hearing before the Committee on Government Operations and Facilities at the Council of the District of Columbia (DC Council). Read her whole testimony and more context on our blog here.

Op-Ed: The DC Council Must Pass the Stop Discrimination by Algorithms Act


Senior Associate Cynthia Khoo co-authored an op-ed in the Washington Post with Daniel Jellins, staff lawyer and clinical teaching fellow at the Communications and Technology Law Clinic at Georgetown Law. The op-ed sets out the differences between algorithmic discrimination and "analog" discrimination; explains why algorithmic discrimination requires legal reform to address; and calls on DC Council to pass the Stop Discrimination by Algorithms Act.

Written Testimony Provided in Support of the Stop Discrimination by Algorithms Act

The Center submitted written testimony in support of the Stop Discrimination by Algorithms Act to the Council of the District of Columbia, co-authored by Senior Associate Cynthia Khoo, Associate Korica Simon, and Executive Director Emily Tucker. Their comments expanded on Khoo's oral testimony at the hearing, further elaborating on the nature of algorithmic discrimination; arguing that business costs should not be weighed in prohibiting illegal discrimination; correcting industry talking points regarding technical definitions and the state of the relevant legal, policy, and research field; and urging the DC Council to act now to reduce harm, rather than wait to follow potential future federal initiatives.