Our Work
Founded in 2014, the Center on Privacy & Technology is a leader at the intersection of privacy, surveillance, and civil rights.
Latest Work

Letter to the FTC on Commercial Surveillance and Data Security Rulemaking
The Center signed onto a joint letter alongside 40 civil rights and technology policy advocacy organizations calling on the Federal Trade Commission to define discrimination as an unfair practice and to require testing of algorithmic systems for discrimination, as part of the Commission's Commercial Surveillance and Data Security Rulemaking. The letter also provided several recommendations for the FTC to develop concrete and specific protections for civil rights.

Comments to the White House Office of Science and Technology Policy
The Center signed onto joint comments spearheaded by the Athena Coalition, submitted in response to the White House Office of Science and Technology Policy's Request for Information on Automated Worker Surveillance and Management. The submission, which Senior Associate Cynthia Khoo (among others) reviewed and provided feedback on, focuses on the harmful impacts of Amazon's ubiquitous surveillance and punitive automated management on its warehouse workers and delivery drivers. The comments also include a number of recommendations for the Biden-Harris administration to protect workers from continued exploitation through automated surveillance and management.

The Intercept: LexisNexis Is Selling Personal Data to ICE
Executive Director Emily Tucker was quoted in an article in The Intercept about ICE's contracts with LexisNexis. “This is really concerning,” Tucker told The Intercept. She compared the contract to controversial and frequently biased predictive policing software, with the alarm heightened by ICE’s use of license plate databases: “Imagine if whenever a cop used PredPol to generate a ‘hot list’ the software also generated a map of the most recent movements of any vehicle associated with each person on the hot list.”

Op-Ed: Chatbots Are Creating Algorithmic Dependence
Executive Director Emily Tucker published an op-ed in Tech Policy Press entitled "Our Future Inside The Fifth Column - Or, What Chatbots Are Really For" about the corporate and capitalist goals of the companies designing chatbots. Chatbots, Tucker warns, are part of a larger scheme to sow algorithmic dependence in the economic spaces most important to the public interest. “The goal is no longer to dominate crucial industries, but to convert crucial industries into owned intellectual property.”

Executive Director Emily Tucker’s Article Translated and Republished
Executive Director Emily Tucker’s article "Our Future Inside The Fifth Column - Or, What Chatbots Are Really For” was translated into Italian and published in AI Aware magazine. In it, Tucker analyzes how major tech companies use chatbots and artificial intelligence as tools for marketing, monopoly consolidation, and political influence rather than for the common good. The article also points out the media's lack of critical attention to these developments and the dangers of corporate power accumulation. AI Aware aims to provide in-depth and accessible analysis of the ethical, social, and technical issues related to AI. The original piece, in English, was published in Tech Policy Press in June 2023.

“Sam’s Plan to Too-Late Regulate” blog
On May 16, 2023, Sam Altman, the CEO of OpenAI, testified before Congress that the government should regulate technologies being marketed as "artificial intelligence." The Privacy Center posted a blog explaining why Altman's proposal amounts to "too-late regulation": regulation that comes both too late in the tech development process and too late in the industry adoption timeline to be meaningful. Read the whole blog here: https://medium.com/center-on-privacy-technology/sams-plan-to-too-late-regulate-2e91516369f9.

Presentation at Labor Research & Action Network Workshop
Distinguished Fellow Gabrielle Rejouis spoke at an LRAN workshop, "American History, Race, Prison, and Surveillance: Atlanta’s Cop City, Extractive Economies and Amazon’s Culture of Surveillance", presenting on the history of worker surveillance. Her presentation connected that history to Amazon's current surveillance of its workers and its support of law enforcement surveillance of Black communities. The presentation drew from research for The Color of Surveillance: Monitoring Poor and Working People.

Worker Surveillance Panel at RightsCon 2023
Senior Associate Cynthia Khoo spoke on a panel at RightsCon 2023 titled "Reclaiming and Building Worker Power in an Age of Workplace Surveillance." She joined representatives from Coworker.org, TEDIC Paraguay, and other privacy and technology policy professionals. Cynthia presented on how both privacy laws and employment and labour laws fail to protect workers from surveillance and data-driven harms, and on what lawyers, other advocates, and decision-makers can do to address the issue.

“Pulling Back the Curtain on the Technologies Automating Inequities in the Criminal Legal System” blog
In recent years there has been a major shift in the criminal legal system: law enforcement authorities have been increasingly using algorithmic technologies when making critical decisions about policing and punishment. Associate Jameson Spivack wrote a blog detailing how these algorithmic technologies are reinforcing and automating biases in the criminal legal system. Read the whole blog here: https://medium.com/center-on-privacy-technology/pulling-back-the-curtain-on-the-technologies-automating-inequities-in-the-criminal-legal-system-b059b8b02342.

Reason: TSA’s Facial Recognition Tech Raises Questions About Bias and Data Security
Justice Fellow Meg Foster was quoted in Reason about TSA's face recognition pilot program: "Whenever there is a power imbalance between powers, consent is not really possible," says Foster. "How does TSA expect them to see and read an inconspicuous notice, let alone tell a TSA agent they want to opt out of face recognition? Especially if it may not be clear to them what the consequences of opting out will be."