Founded in 2014, the Center on Privacy & Technology is a leader at the intersection of privacy, surveillance, and civil rights.
October 2021: Associate Cynthia Khoo submitted oral and written testimony on the Massachusetts Information Privacy Act to the MA Joint Committee on Advanced Information Technology, the Internet and Cybersecurity. MIPA would restrict biometric surveillance, worker surveillance, and algorithmic discrimination.
October 2021: News Beat highlighted the wrongful arrests of Michigan residents Michael Oliver and Robert Williams. Center Senior Associate Clare Garvie spoke with hosts about the risks to free speech and privacy posed by the technology's widespread adoption, its misuse, and its racially disparate impacts.
September 2021: Associate Cynthia Khoo filed comments regarding “A Proposal for Identifying and Managing Bias in Artificial Intelligence” by the National Institute of Standards and Technology. The submission focuses on civil rights impacts and the limits of a technical approach to algorithmic bias.
July 2021: The Center, represented by Georgetown Law's Civil Litigation Clinic, filed an amicus brief in a lawsuit against face recognition company Clearview AI alleging violations of the Illinois Biometric Information Privacy Act (BIPA).
June 2021: Associate Jameson Spivack wrote an op-ed in The Washington Post arguing that Maryland's criminal justice reform legislation should include restrictions on predictive police technologies.
May 2021: CBS 60 Minutes profiled the use of face recognition by police in the wake of three publicized misidentifications resulting from the technology. Senior Associate Clare Garvie was interviewed.
May 2021: PBS NOVA produced a segment on police face recognition risks, highlighting the misidentification of Detroit resident Michael Oliver. Senior Associate Clare Garvie was interviewed.
April 2021: The Center hosted a virtual screening of the documentary "Coded Bias." Center Senior Associate Clare Garvie provided a video introduction to the screening, connecting the themes in the film to the Center’s research and advocacy.
March 2021: A student chapter of AI4ALL, a US-based nonprofit dedicated to increasing diversity and inclusion in AI research and policy, invited Senior Associate Clare Garvie to give a webinar with Emiliano Falcon-Morano of ACLU-MA on face recognition use by police.
March 2021: Senior Associate Clare Garvie joined a panel discussion about regulations needed for the artificial intelligence sector as part of UMass Amherst's annual Computing for the Common Good symposium.
February 2021: Senior Associate Clare Garvie gave a keynote presentation about face recognition in U.S. policing for the UFIG annual meeting, an initiative at the University of New South Wales aimed at informing the debate around face recognition development and deployment.
January 2021: Center Associate Jameson Spivack was featured in a three-part podcast series, City Surveillance Watch, exploring how cities are using surveillance technologies, and the implications of these invasive tools.
January 2021: Director of Research and Advocacy Emily Tucker and Senior Associate Clare Garvie co-taught a Week One course for 1Ls on big data, face recognition, and legislative lawyering, culminating in a mock hearing where students argued the merits of a bill limiting federal access to Maryland driver data.
December 2020: The FTC issued 6(b) orders to nine social media and video streaming services requesting information, including questions about data-driven bias and discrimination. This action comes after Associate Director Laura Moy organized a coalition letter urging the FTC to include questions on data and bias in future agency studies.
December 2020: Associate Director Laura Moy and former Associate Gabrielle Rejouis released a report through the Day One Project, “Addressing Challenges at the Intersection of Civil Rights and Technology,” outlining critical gaps in federal agencies' capacity to actively combat these issues.
December 2020: Following a screening of Coded Bias, Senior Associate Clare Garvie joined a discussion with Data for Black Lives National Organizing Director Tawana Petty about surveillance in Detroit hosted by Detroit Free Press.
November 2020: The Center filed an amicus brief arguing that Baltimore Police Department’s warrantless aerial surveillance program is unconstitutional.
November 2020: Policy Associate Jameson Spivack joined an expert panel at a symposium "Artificial Intelligence: Bias and the Implications of Algorithms, Facial Recognition, and Machine Learning Technologies" convened by American University's Journal of Gender, Social Policy & the Law.
November 2020: Senior Associate Clare Garvie joined an expert panel discussing Coded Bias, a documentary about face recognition bias. Other panelists included director and producer Shalini Kantayya, MIT researcher Joy Buolamwini, and the ACLU's Kade Crockford; the panel was moderated by CNN's Van Jones.
October 2020: Center Policy Associate Jameson Spivack spoke as part of an expert panel "Racial Biases in Artificial Intelligence" for a webinar hosted by University of Maryland business school.
October 2020: The Perpetual Line-Up, the Center's foundational face recognition report, was recognized by the inaugural Tech Spotlight at the Harvard Kennedy School's Belfer Center, highlighting initiatives that orient technology to a public purpose and ensure its safety, fairness, and inclusivity.
October 2020: Senior Associate Clare Garvie discussed the risks of police face recognition use in a long-form Q&A with Document Journal.
September 2020: The Government Accountability Office issued a report finding that airport face scans did not improve airport agents' day-to-day capabilities. The Center's December 2017 report, Not Ready for Takeoff, found that airport face scans were unjustified.
September 2020: Founding Director Alvaro Bedoya wrote a piece in Slate on why people should care about Palantir's direct listing and what it means for immigrants and the native-born alike.
September 2020: The Center participated in the Digital Day of Action #NoTechForICE campaign around Palantir's direct listing. The campaign raised awareness that Palantir is a company that profits off the surveillance and exploitation of communities targeted by ICE and the Department of Defense.
September 2020: Policy Associate Jameson Spivack joined a panel of experts to discuss face recognition and racial bias hosted by the University of Minnesota Law School.
September 2020: Associate Director Laura Moy organized a coalition letter urging the FTC to include questions on data, bias, and disparate impact in any new studies the agency undertakes. The letter was signed by 27 civil rights, digital rights, racial justice organizations, and consumer groups.
September 2020: Policy Associate Jameson Spivack and Senior Associate Clare Garvie’s paper analyzing how lawmakers around the US are trying to regulate face recognition was published in the AI Now Institute’s biometric technologies report.
September 2020: Senior Associate Clare Garvie spoke on a panel paying tribute to the late MI State Rep. Isaac Robinson about the fight for racial justice in Detroit, hosted by the Wright Museum and Detroit Public Television, and moderated by Tawana Petty of the Detroit Community Technology Project.
September 2020: Senior Associate Clare Garvie participated in an NACDL training webinar on face recognition in criminal trials, alongside computer scientist Dr. Arun Ross and ACLU Michigan senior staff attorney Phil Mayor.
September 2020: Senior Associate Clare Garvie was interviewed for the 70 Million podcast about police use of face recognition in Detroit and ongoing efforts of community members to put a stop to it.
August 2020: In an episode of the podcast "In Machines We Trust" from MIT Tech Review, Policy Associate Jameson Spivack discusses how police can use face recognition to discourage protest and free speech.
August 2020: In an episode of the podcast series Banned in PDX, Policy Associate Jameson Spivack discusses police use of face recognition on protesters, with a focus on Portland, OR.
August 2020: Senior Associate Clare Garvie spoke to Digital Privacy News about developments in the ongoing effort to bring police face recognition under control since the Center released The Perpetual Line-Up in 2016.
August 2020: Senior Associate Clare Garvie participated in a town hall on face recognition hosted by the Project on Government Oversight with Malkia Cyril of the Center for Media Justice, Matt Cagle of ACLU NorCal, POGO's Jake Laperruque, Rep. Jimmy Gomez, and Sen. Jeff Merkley.
July 2020: Policy Associate Jameson Spivack spoke on the RightsCon 2020 panel "The Revolution Will Be Tracked: How to Protect Activism in the Age of Mass Surveillance" about mass surveillance, face recognition, and protests.
July 2020: Policy Associate Jameson Spivack wrote an op-ed for Route Fifty about the lack of police face recognition regulation, how this allows police to surveil protesters, and the potential consequences for civil rights and liberties.
July 2020: Senior Associate Clare Garvie was interviewed on CounterSpin about police face recognition misuse, misidentifications, and the risks to free speech and association in light of nationwide protests.
July 2020: Senior Associate Clare Garvie joined a panel on face recognition and public protest hosted by Al Jazeera's The Stream, along with Silkie Carlo of Big Brother Watch and reporter Mary Hui.
July 2020: Following a presentation by artist Adam Chin on his latest exhibition Front and Profile, Senior Associate Clare Garvie joined a conversation hosted by SF Camerawork about AI, criminal justice, and the role of art in exposing and examining these issues.
July 2020: The Center endorsed legislation introduced by Sens. Markey and Merkley and Reps. Jayapal and Pressley to stop government use of biometric technology, including facial recognition tools.
July 2020: The Center was one of 40 organizations that wrote a letter urging Congress to pass strong police face recognition legislation, stop continued federal funding for the technology, and ensure policing reforms include face recognition prohibitions.
June 2020: The Perpetual Line-Up, the Center's foundational face recognition report, was cited in the "Last Week Tonight" segment on face recognition technology. Senior Associate Clare Garvie also contributed to the reporting on background.
June 2020: Responding to news of an arrest made on the basis of a face recognition misidentification, Senior Associate Clare Garvie wrote an article for ACLU's website about the likelihood that this has happened before—and will continue to happen.
June 2020: Senior Associate Clare Garvie participated on a panel about police surveillance in Detroit organized by Stanford's Digital Civil Society Lab. Other speakers included Tawana Petty, Eric Williams, and Cierra Robson.
June 2020: The Center joined over 100 organizations calling on Congress to cease federal funding for police surveillance technology used to criminalize dissent.
June 2020: The Center joined over 80 other organizations in calling for information technologies deployed to combat the spread of COVID-19 to preserve civil rights and privacy.
June 2020: The Center launched "A Seat at the Table: Creating Inclusive Tech Policy Organizations," a guide co-authored by Center Associate Gabrielle Rejouis. The guide provides tips on advertising practices, creating opportunities, and retaining diverse employees.
May 2020: The Center coordinated a letter with 9 other organizations urging state governors to issue policies protecting essential workers during the pandemic and to petition OSHA to issue an emergency temporary standard to that end.
April 2020: Policy Associate Jameson Spivack joined the Carnegie Council podcast for a discussion on the risks of police face recognition, proposed legislation to regulate it, and privacy in a time of increased health surveillance.
April 2020: Senior Associate Clare Garvie participated in an online panel organized by IAPP and FPF on the current state of face recognition legislation in the U.S. Clare spoke alongside Brenda Leong of FPF, James Loudermilk of IDEMIA, and Hector Dominguez Aguirre with the City of Portland.
March 2020: The Center joined 29 other organizations in signing a letter to the United Federation of Teachers (UFT) urging it to cancel any existing contracts for face surveillance. In response to the letter, UFT committed to opposing face recognition use in NYC schools.
March 2020: Policy Associate Jameson Spivack testified before the Maryland House of Delegates in support of legislation that would place a moratorium on law enforcement use of face recognition technology. He testified alongside advocate Amara Majeed, among others.
March 2020: In an op-ed for Medium's OneZero publication, Associate Gabrielle Rejouis argued that algorithmic accountability must incorporate civil rights principles, examining Google algorithms that discriminated against Black people.
March 2020: Founding Director Alvaro Bedoya was named to The Washington Post's Technology 202 Network panel of technology experts.
March 2020: Policy Associate Jameson Spivack argued in The Baltimore Sun that Maryland has one of the most invasive face recognition systems in the nation, and that it was time for the legislature to put a moratorium on police face recognition use.
March 2020: EFF created a website to help the public learn how their face scans are used. The website draws on information uncovered by the Center in previously unpublished FOIA response documents.
February 2020: Senior Associate Harrison Rudolph testified before the Maryland House of Delegates and Maryland Senate in support of legislation that would require ICE to obtain a warrant before accessing driver data.
February 2020: In June 2019, Associate Director Laura Moy, along with other civil society groups, urged the FCC to investigate major wireless providers' treatment of consumer location data. As a result, in February 2020, the FCC proposed fines against these providers for selling customer location data.
February 2020: Citing the Center's research and privacy, accuracy, and bias concerns, The Boston Globe's Editorial Board recommended that the Massachusetts legislature and lawmakers across the country hit the pause button on police face recognition use.
December 2019: Citing the Center's 2016 report, The Perpetual Line-Up, as motivation, NIST released a report on differential error rates by race, sex, and age in face recognition algorithms. This fulfills one of our 2016 recommendations that NIST test for racial bias in the technology.
December 2019: Senior Associate Clare Garvie was invited to lead a seminar for Public Square in Santa Barbara, CA. The seminar explored legislative solutions to the risks posed by police use of face recognition technology.
November 2019: The Center released the "Worker Privacy Act," a discussion draft bill on worker privacy protections which increases employee input, prohibits certain uses of data, expands who is considered an employer, and designates a federal office to address worker privacy concerns.
October 2019: Senior Associate Clare Garvie participated in a public panel and government workshop on the legal and ethical issues surrounding face recognition technology in Wellington, New Zealand. The events were hosted by Victoria University Wellington.
October 2019: Policy Associate Jameson Spivack testified before the Massachusetts Legislature's Joint Committee on the Judiciary in support of legislation proposing a moratorium on government use of face recognition. Jameson was part of an expert panel discussing the risks of the technology.
October 2019: Investigative reporter McKenzie Funk published an exposé in The New York Times, "How ICE Picks Its Targets in the Surveillance Age." Funk's article cited the Center's research on ICE requests for face recognition searches of state DMV databases.
October 2019: “We’re all getting comfortable with face recognition,” Senior Associate Clare Garvie warns in a New York Times video op-ed. “But the convenience is blinding us to how risky this technology actually is, and how it is being used without us realizing.”
September 2019: Senior Associate Clare Garvie presented to Portland's City Council and Mayor Ted Wheeler about the risks posed by police face recognition. Portland is considering restrictions on the use of face recognition by police and private companies.
September 2019: Associate Harrison Rudolph testified before the Utah Legislature's Government Operations Interim Committee about face recognition technology. The meeting followed reports about the Center's research showing Utah's ID photos had been routinely scanned by law enforcement.
September 2019: A bipartisan, bicameral group of legislators sent an oversight letter to the DHS and FBI concerning government use of face recognition technology. The letter followed reports about Center research showing how federal agencies requested face recognition searches of state databases.
September 2019: Executive Director Laura Moy joined other experts on a new working group paper on moving the encryption policy conversation forward. The paper offers suggestions on this issue, along with principles and use cases for evaluating emerging encryption policy proposals.
September 2019: Law Fellow Gabrielle Rejouis published an op-ed in Slate's Future Tense on the dangers of workplace surveillance and the need for privacy protections.
August 2019: The Center's research was featured in a front-page article in The New York Times about the NYPD's inclusion of juvenile photos in its face recognition program.
August 2019: Executive Director Laura Moy spoke on the Marketplace "Make Me Smart" podcast about what privacy means in the era of big tech.
July 2019: After The Washington Post reported the Center's research on ICE's FR searches of DMV databases in Utah and Washington, Utah Lt. Gov. Cox expressed concern and said they would be investigating ICE's access. The Center's findings left lawmakers and civil society groups in Utah outraged.
July 2019: Citing America Under Watch, the Detroit Free Press detailed the problems with FRT on Green Light cameras. The Center's research has been central to the fight to limit FRT use by law enforcement in Michigan, including two bills in the MI legislature, House Bill 4810 and Senate Bill 0342.
July 2019: Law Fellow Gabrielle Rejouis spoke on the IGF-USA "Which National Privacy Strategy Should the US adopt?" panel. The panel examined different privacy frameworks and the benefits of each approach.
July 2019: Law Fellow Gabrielle Rejouis spoke on the OTI "Paying for Privacy" panel discussing how federal privacy legislation may impact current online business models.
July 2019: Center Freedom of Information Act requests showed that ICE had asked at least three states that offer undocumented people driver’s licenses to run face recognition searches of their DMV photos. The documents were released via an exclusive with The Washington Post.
June 2019: The Center, with New America's Open Technology Institute and Free Press, filed a complaint with the FCC against wireless carriers for sharing customers' location information without consent. The Center was represented by the Samuelson-Glushko Technology Law & Policy Clinic at Colorado Law.
June 2019: Law Fellow Gabrielle Rejouis wrote an op-ed connecting the history of Juneteenth, when news of emancipation reached Texas slaves, to current digital disruptions of communication, including optimization algorithms and disinformation.
May 2019: Citing the Center's reports, The New York Times' editorial board called for regulation of NYPD's use of face recognition, warning that "dragnets become tools aimed at minority populations." Earlier, columnist Farhad Manjoo called for a moratorium on the technology based on Center reports.
May 2019: Senior Associate Clare Garvie testified before the House Committee on Oversight and Reform. She argued that in the absence of regulation police use of face recognition poses risks to our First, Fourth, and Fourteenth Amendment rights. Because of those risks, a moratorium is appropriate.
May 2019: Senior Associate Clare Garvie spoke with NPR's On Point about the recent San Francisco face recognition ban and the privacy and civil liberties concerns surrounding police use of the technology.
May 2019: Senior Associate Clare Garvie and Executive Director Laura Moy explained how police agencies in Chicago and Detroit have purchased citywide face surveillance networks that are capable of scanning the faces of city residents in real time as they walk down the street.
May 2019: Senior Associate Clare Garvie explained how police agencies across the country misuse face recognition technology, referencing actual use cases from the NYPD. Analysts using these systems sometimes submit forensic sketches and routinely doctor low-quality photos to make them clearer.
May 2019: Executive Director Laura Moy delivered a presentation on "Equity and Policing Technologies—The Use of Predictive Policing, Face Surveillance, and Cell-Site Simulators" at the annual symposium of the Washington State Supreme Court.
April 2019: Executive Director Laura Moy has been part of a working group on encryption policy since 2018. The working group released two papers: one on "Likely Future Adoption of User-Controlled Encryption," and one on "Implications of Quantum Computing for Encryption Policy."
April 2019: The Center joined 25 other digital rights and civil rights organizations in elevating particular harms experienced by vulnerable communities. Addressed to the House and Senate Commerce committees, the letter outlined key provisions that data practices legislation should include.
April 2019: The Center endorsed the Algorithmic Accountability Act of 2019, introduced by Sen. Ron Wyden (D-OR), Sen. Cory Booker (D-NJ), and Rep. Yvette Clarke (D-NY). The bill requires companies to evaluate the algorithms they use for bias.
April 2019: Associate Harrison Rudolph endorsed Rep. Debbie Wasserman Schultz's (D-FL) bill, the Families, Not Facilities Act. The bill would prohibit the use of children's information to find and deport their loved ones.
April 2019: Founding Director Alvaro Bedoya delivered the U.S. Senator Dennis Chavez Endowed Lecture on Law & Civil Rights at the University of New Mexico School of Law. In his lecture, he drew upon the research underlying several Color of Surveillance conferences to argue that privacy should be considered a civil right, not just a civil liberty.
April 2019: In a blog post sent to close to half a million Twitter followers, Shahid Buttar of the Electronic Frontier Foundation promoted our founding director's Chavez lecture on the connection between privacy and civil rights. Buttar argued that the lecture and the Center's Color of Surveillance conference series are "key pieces of a growing effort to ensure that privacy and protection from surveillance are seen as part of defending civil rights."
April 2019: The Center's Policy Associate Jameson Spivack was interviewed and quoted in The Hill about recent legislative efforts to combat the potential discriminatory effects of AI. "I think that any legislation needs to recognize that while these technologies affect everyone, they disproportionately affect vulnerable people."
March 2019: The Center joined ACLU, EFF, and Innocence Project in filing an amicus brief in support of Willie Allen Lynch, petitioner to the Florida Supreme Court. The brief argues that the state failed in its obligation to disclose information about its use of face recognition technology to the petitioner.
March 2019: Senior Associate Clare Garvie participated in a workshop at the Cleveland-Marshall College of Law on regulating the use of face recognition by law enforcement and commercial entities.
March 2019: The Center's Executive Director, Laura Moy, delivered a talk at SXSW on location privacy. She explained the growing scale and sophistication of numerous location tracking technologies, and made the case for policies to salvage our disappearing location privacy.
February 2019: The Center joined 42 other digital rights and civil rights organizations calling for Congress to prioritize civil rights debates, hearings, and legislation.
February 2019: The Center drafted, organized, and filed comments, signed by ten other organizations, elevating the need for proactive action in response to the widespread use of algorithmic decision-making. The Center argued action must be taken to curb discriminatory harms to marginalized communities.
February 2019: A federal law was signed prohibiting the use of children's information for deportation purposes until September 30, as part of the 2019 Homeland Security Appropriations Bill. The law followed a November 2018 letter, coordinated by the Center and co-signed by 111 other NGOs, calling for the termination of an interagency agreement that used children's information to find and deport their relatives.
February 2019: The Center signed on to a letter from 47 organizations calling on legislators to protect civil rights, equity, and equal opportunity in the digital ecosystem. The letter draws directly from the Civil Rights Principles for the Era of Big Data released in 2014 and includes a call for fairness in automated decisions and an end to high-tech profiling.
January 2019: Executive Director Laura Moy spoke at an event co-hosted by Next Century Cities, the American Action Forum, and Public Knowledge exploring opportunities for bipartisan action on technology policy issues in the new Congress.
January 2019: Executive Director Laura Moy spoke at the second annual Data for Black Lives conference on a panel discussing "Abolition in the Age of Big Data."
December 2018: The Center worked with computer scientist Joy Buolamwini to create a pledge for vendors of automated facial analysis technologies to sign, signaling their commitment to responsibility and accountability.
December 2018: The Washington Post's Editorial Board reiterated its call for Congress to step in and decide the line between acceptable and unacceptable uses of face recognition technology, after the Department of Homeland Security announced a real-time face surveillance pilot at the White House.
December 2018: 15 Democratic senators, led by Kirsten Gillibrand (D-NY), sent a letter to two federal agencies calling for them to stop using immigrant children's data to deport their relatives. The letter followed a public pressure campaign coordinated by the Center and its allies on the Immigrant Surveillance Working Group.
December 2018: The New Yorker talked to Senior Associate Clare Garvie for an in-depth examination into how face recognition is revolutionizing everything from farming in Ireland to policing in the United States.
December 2018: Executive Director Laura Moy filed comments relevant to a planned Federal Trade Commission hearing on competition and consumer protection in the 21st Century. She argued that discriminatory data practices should not be allowed and that when consumers cannot avoid sharing their information, heightened privacy protections should apply.
November 2018: The Center drafted, organized, and filed comments, signed by thirteen other organizations, regarding the administration's use of children's information to deport their relatives. The comments explain that deporting families using information collected to place unaccompanied children is not only inhumane, but also unlawful and poor policy.
November 2018: The Center coordinated a letter to the Departments of Health and Human Services and Homeland Security calling for the rescission of an interagency agreement that uses children's information to deport their relatives. 111 other civil rights and civil liberties organizations signed on to the letter, which received coverage in the Associated Press.
November 2018: The Center submitted comments to the National Telecommunications and Information Administration urging the agency to move further in the direction of strong consumer protection as it defines the privacy outcomes and high-level goals that this administration will prioritize.
November 2018: The Center co-drafted principles outlining essential components to be included in comprehensive privacy legislation. The principles were backed by 34 organizations.
November 2018: Senior Associate Clare Garvie spoke at the National Institute of Standards and Technology's first conference, on a panel about technical factors affecting the deployment and use of face recognition technology. Clare's remarks focused on the real-world consequences of differential error rates, including in the law enforcement context.
November 2018: The Wall Street Journal featured Senior Associate Clare Garvie in an episode of its “Moving Upstream” video series. In the interview, Clare discusses the use of face recognition technology by schools, police departments, border security organizations, and others.
November 2018: Founding Director Alvaro Bedoya reflected on artist Trevor Paglen's "Sights Unseen" exhibition at a panel convened by the Smithsonian Museum.
October 2018: Executive Director Laura Moy testified about consumer privacy before the Senate Commerce Committee. She called for increased attention to commercial data practices that can lead to societal harms, such as discrimination, erosion of trust online, amplification of hate speech, and dissemination of propaganda, misinformation, and disinformation.
October 2018: On October 5, 2018, a federal law was signed requiring privacy and racial bias assessments of the federal government's use of biometric technologies at airports—the first ever federal law requiring artificial intelligence bias testing. The law was enacted following the Center's December 2017 report, Not Ready For Takeoff, which found privacy and bias problems in these deployments.
September 2018: The Office of the Inspector General for the Department of Homeland Security conducted an audit of the agency's use of face scans at airport departure gates that closely tracked and validated many of the concerns raised in the Center's Not Ready for Takeoff report. OIG reported, among other things, that the program exhibits age bias, causes traveler delays, and may end up being far more costly than initial estimates.
September 2018: Executive Director Laura Moy spoke on a panel hosted by New America on police surveillance and the adoption—by a growing number of cities—of ordinances that help create opportunities for communities to exercise control over the surveillance technologies their police agencies have and use.
September 2018: Senior Associate Clare Garvie participated in a training webinar for public defenders on police use of face recognition technology convened by the National Association of Criminal Defense Lawyers.
August 2018: The Center drafted, coordinated, and filed comments signed by 16 organizations in response to the Federal Trade Commission's inquiry on 21st Century competition and consumer protection. The comments highlight the harms faced by marginalized communities on online platforms.
August 2018: Senior Associate Clare Garvie conducted a training on police use of face recognition technology to the Ninth Judicial Circuit 2018 Defender Summer School at Barry University in Orlando, FL.
July 2018: Executive Director Laura Moy testified about consumer privacy before the House Energy & Commerce Committee Subcommittee on Communications and Technology. In her testimony, Laura called for privacy legislation that establishes rulemaking authority and strong enforcement, and does not eliminate existing protections for consumers' data.
July 2018: The Center hosted a half-day event discussing the implications of the Supreme Court's decision in Carpenter v. United States, a landmark case about the privacy of cell phone location information.
July 2018: The third annual Color of Surveillance conference delved into the surveillance of religious minorities in the United States. In addition to discussions about this issue's contemporary impact, the conference featured a historian who spoke about surveillance of the Pilgrims in England.
July 2018: Senior Associate Clare Garvie wrote an op-ed for The Washington Post about how face surveillance technology risks changing our expectations of privacy, our right not to be investigated unless suspected of wrongdoing, and our freedom from deeply flawed policing practices.
July 2018: Citing The Perpetual Line-Up, the Editorial Board of The Washington Post endorsed the Center's view that in the ever-expanding field of face recognition, Congress should regulate, finding ways “to balance the public benefits of facial recognition with the obvious privacy concerns.”
June 2018: Executive Director Alvaro Bedoya argued in an op-ed for The New York Times that "the impact of consumer tracking varies greatly by race, class, and power."
May 2018: Deputy Director Laura Moy responded in Slate to a new science fiction short story by Meg Elison. In her reaction, Laura reflected on the story's themes of prejudice, outright racism, and the role of government surveillance in maintaining systems of oppression.
May 2018: After months of advocacy coordinated by the Center and others in the Immigrant Surveillance Working Group, the Department of Homeland Security formally dropped its "Extreme Vetting Initiative." It would have automatically and continuously scanned American immigrants' social media posts to flag 10,000 individuals annually for deportation investigations.
April 2018: Key ranking members on the U.S. House Homeland Security Committee called on the Department of Homeland Security to drop its plan to use machine learning to automatically screen the social media posts of American immigrants.
April 2018: The Center co-wrote a 42-organization coalition letter to Axon's new "AI Ethics Board." The letter urges the board to center the experiences of policed communities in its process, and argues that integrating face surveillance with body-worn cameras would be "categorically unethical."
April 2018: For the fourth year, Georgetown Law and MIT students convened to pitch experts on the proposed privacy legislation they drafted in a course co-taught by Alvaro Bedoya, Laura Moy, and David Vladeck. The judges' panel included a state legislator, the former general counsel of the FBI, and representatives of the ACLU and the Department of Justice.
April 2018: In a New York Times op-ed, Executive Director Alvaro Bedoya warns against rushing to pass broad federal privacy legislation, cautioning that it could be easily co-opted by powerful interests.
April 2018: Clare Garvie and Executive Director Alvaro Bedoya, along with Jonathan Frankle, presented the findings of The Perpetual Line-Up at a symposium on face recognition and big data organized by Boston University's Division of Emerging Media Studies in their College of Communication.
March 2018: Citing the AI Experts Letter coordinated by the Immigrant Surveillance Working Group, the Congressional Black Caucus called on DHS to drop its "Extreme Vetting Initiative," a plan to use machine learning to automatically vet American immigrants' online activities, and flag a minimum of 10,000 individuals annually for deportation investigations.
December 2017: Senators Edward Markey (MA) and Mike Lee (UT) sent a letter urging the Department of Homeland Security to "stop the expansion" of its Biometric Exit program and to address privacy concerns about the program. The letter cited the Center's December 2017 report on the program, Not Ready for Takeoff, which raised many of the same concerns.
December 2017: The Center released a report on the Department of Homeland Security's airport face scanning program, finding that the program never was justified, may violate federal law, is technically flawed, and has not been sufficiently evaluated for bias. The report recommends a suspension of the program pending correction of these problems.
December 2017: Clare Garvie and Founding Director Alvaro Bedoya spoke at the Cato Institute's annual surveillance conference about the "Digital Muslim Ban" and the ever-increasing use of face recognition technology by police.
November 2017: Deputy Director Laura Moy testified on algorithmic decision-making and privacy before two subcommittees of the House Energy & Commerce Committee. She called for consumer protections that are forward-looking, flexible, strongly enforced, and contextually appropriate.
November 2017: The Center co-wrote and coordinated a letter signed by 50 computer scientists and AI specialists that called on the Department of Homeland Security to halt its planned use of machine learning to screen immigrants' social media posts. The Center also co-wrote and coordinated a letter signed by over 50 NGOs denouncing the program.
October 2017: Deputy Director Laura Moy testified on privacy and data security before the House Finance Committee. She called for enhanced rulemaking and enforcement authority for federal agencies that oversee data security, and cautioned Congress against establishing new data security protections that eliminate important existing protections.
September 2017: In an op-ed published in The Guardian, Clare Garvie argued that Apple’s incorporation of face recognition into the iPhone X may lead to a dangerous complacency to the risks of the pervasive deployment of the technology.
June 2017: The second annual Color of Surveillance conference examined the issue of government surveillance of American immigrants. The event encompassed historical perspectives of immigrant surveillance in the 20th and 21st Centuries and included a discussion with Professor Xiaoxing Xi, a US-based physics professor who was falsely accused of being a spy.
May 2017: The Center sued the New York City Police Department under the state's Freedom of Information Law for the department's refusal to disclose records pertaining to its use of face recognition technology. Litigation is ongoing.
March 2017: Executive Director Alvaro Bedoya testified before the House Oversight Committee about law enforcement's use of face recognition technology, with a focus on the threat the technology poses to privacy and civil rights. Chairman Jason Chaffetz (R-UT) called the hearing in direct response to the Center's October 2016 Perpetual Line-Up report.
March 2017: On the same day as the U.S. House Oversight hearing on law enforcement's use of face recognition, Executive Director Alvaro Bedoya penned an op-ed for The Washington Post describing the dangers of a society where every face is scanned. "Perhaps you have nothing to hide. But do you resemble someone who does?"
October 2016: The Center conducted a year-long investigation into how police use face recognition technology. The ensuing report is the most comprehensive survey to date on the topic and the risks this technology poses to privacy, civil liberties, and civil rights. It contains recommendations to police, legislators, and others, as well as a model state and federal bill that would control the use of face recognition technologies.
June 2016: The Center built and mobilized a coalition to press the FBI to reject changes that would enshroud its massive fingerprint database in secrecy.
May 2016: Google publicly announced that it would no longer serve ads for payday loans, financial products that routinely harm low-income people and that were being targeted to vulnerable consumers who searched for terms like "I need money for rent" and "I need money for groceries." The announcement came after months of engagement with the Center, The Leadership Conference on Civil and Human Rights, Upturn, and other allies.
April 2016: The inaugural Color of Surveillance conference focused on the disproportionate amount of government surveillance on the African American community. It hosted robust conversations on the historic and current surveillance of this group, including a debate between a Pulitzer-winning MLK biographer and the general counsel of the FBI.
January 2016: In an essay for Slate, Executive Director Alvaro Bedoya argued that surveillance debates must reckon with "the color of surveillance"—the disparate impact of government monitoring. "Across our history and to this day, people of color have been the disproportionate victims of unjust surveillance."
December 2015: Clare Garvie and Founding Director Alvaro Bedoya filed a comment examining how online lead generation creates and perpetuates the disparate impact of payday loans on African American borrowers. They urged the FTC to use its authority under ECOA and Section 5 of the FTC Act to investigate and bring enforcement actions against responsible companies.