Tech Foundations: Congressional Staff Learn the Latest at Georgetown Law

August 16, 2019

Alexandra Givens of Georgetown Law's Institute for Technology Law & Policy with D.C. Attorney General Karl Racine and Georgetown Law Professor David Vladeck at "Tech Foundations for Congressional Staff" on August 13.

Today, new technologies are created faster than regulation can keep up — and the legal issues are not always easy to predict. Georgetown Law’s Institute for Technology Law & Policy hosts an annual program for congressional staff to help address that gap. This year’s program, the Institute’s third such event, explored the business models, legal frameworks and real-world impact of new and emerging technologies.

“It’s no secret that Congress finds it challenging to keep up with new technologies,” said Alexandra Givens, the Institute’s Executive Director, who served as a senior advisor in the Senate before coming to Georgetown. “Our goal is to broaden staffers’ perspectives and give them access to clear, current information to help them make more informed policy decisions.”

The two-day conference on August 13 and 14 drew on Georgetown Law’s deep expertise in fields including tech, consumer protection and privacy. Professor David Vladeck, who served as the head of the Federal Trade Commission’s Bureau of Consumer Protection from 2009 to 2012, led a conversation with D.C. Attorney General Karl Racine on “The Role (and Roadblocks) of State and Private Enforcement to Protect Consumer Privacy.”

Terrell McSweeny (L’04), a former commissioner of the Federal Trade Commission and a current fellow at the Tech Institute, spoke about the FTC’s role, powers and future needs. Nancy Libin (L’93), now chair of the Privacy & Security + Technology practice at Davis Wright Tremaine LLP, spoke about the realities of how businesses are implementing the major privacy legislation adopted in Europe last year.

Terrell McSweeny (L’04).

Professor Paul Ohm gave a presentation on “Investigating Crimes in the DarkNet,” examining the hidden networks that conceal users’ information and make it harder for law enforcement to identify those engaged in online criminal activity.

Law enforcement use of facial recognition technologies

Visiting Professor Alvaro Bedoya and Clare Garvie (L’15) discussed the use and impact of facial recognition technologies. Bedoya is the founding director of Georgetown Law’s Center on Privacy & Technology; Garvie is a Senior Associate at the Center who has authored several groundbreaking reports on police use of facial recognition technologies.

How common is the use of facial recognition technologies by police? “Over half of all American adults are enrolled in facial recognition databases that are accessible to law enforcement and used in criminal investigations,” Garvie said. “Not because over half of us have been arrested, but because over half of us have a driver’s license that has been enrolled.”

Clare Garvie (L’15) and Visiting Professor Alvaro Bedoya.

Far from being a miracle crime-solving tool, Garvie explained, facial recognition technology is problematic at every stage of the process. With no rules as to what can be submitted as a probe photo, police have used high-quality photos of celebrities thought to resemble a suspect; forensic sketches (“studies have shown this does not work,” Garvie said); and even edited photos in which a random person’s open eyes are substituted for a suspect’s closed ones — in the hopes that the technology will produce a match.

Images of women and people of color can be particularly susceptible to matching errors, Garvie warned, and human bias is possible when the results are interpreted. In almost all cases, defense attorneys are not told when the technology has been used to identify their clients.

Although the new technology is often compared to DNA matching or fingerprints, Bedoya noted important differences.

“A police officer cannot look across the street at a crowd, and secretly cheek-swab them,” he said. “Nor can they do so in the blink of an eye, without knowledge or consent.” The driver’s license databases, he noted, are made up of law-abiding, taxpaying citizens — not criminals.

The future of facial recognition is here, Garvie said. “It’s only going to get worse unless we legislate.”

Bedoya called for a moratorium. “We used to think regulation was the way; now we think it’s time to hit the ‘pause’ button…” Bedoya told the staffers, noting that lawmakers on both sides of the aisle have felt the same. “What more [evidence] do you want, before you have to say, ‘let’s hit the pause button’? Do you want examples of abuse [of the system]? We’ve got them… Do you want examples of bias? We’ve got them… Do you want evidence that this is used in secret…? We’ve got that, too…”

“We think it’s time to hit pause,” he told the audience, “but we will leave the policy questions for you.”