Volume 56, Issue 4 (2019)

Pandora's Algorithmic Black Box: The Challenges of Using Algorithmic Risk Assessments in Sentencing

by Leah Wisser

If society continues to use algorithmic risk assessments as they are currently deployed, without the proper limitations and oversight, vast opacity will inevitably cloud our otherwise transparent criminal justice system and risk the introduction of various forms of bias. These algorithms are widely misunderstood, both by society at large and by the very judges who factor the risk scores into their sentencing decisions. Members of the Senate have urged the United States Sentencing Commission to conduct an independent study of the inner workings of these algorithms and to “issue a policy statement to guide jurisdictions implementing these tools.” The Senate members raised concerns about fairness, racial discrimination, and lack of transparency. We must protect defendants and our justice system’s integrity from these algorithms’ flaws. Unfortunately, our constitutional framework does not provide us with the appropriate tools to address the problematic nature of algorithmic risk assessments. Therefore, I offer an administrative solution as a better way of addressing these concerns.
