Compounding Injustice: The Cascading Effect of Algorithmic Bias in Risk Assessments
The increasing pervasiveness of algorithmic tools in criminal justice has generated a growing body of research, legal scholarship, and scrutiny of automated approaches to consequential decision making. A key focus of this literature is racial bias in algorithmic risk assessment tools and its correlation with higher bail amounts and/or pretrial detention. These two phenomena combine to initiate a cascading effect: increased likelihoods of conviction, incarceration, harsher sentencing, higher custody levels, and barriers to parole, along with negative impacts on factors unrelated to criminal history, all of which feed into subsequent assessment instruments for defendants who are re-arrested. This escalating cascade of algorithmic bias errors has particularly dire consequences for Black defendants, who are statistically more likely than white defendants to receive higher failure-to-appear (FTA) and recidivism risk scores, and who are thus more likely to be negatively affected by subsequent decisions, both human and computer-aided, throughout the criminal justice process. This phenomenon is sometimes referred to in the literature as "disparate impact," but that framing lacks a deeper examination of its broad-based effects. This Article endeavors to advance that examination by looking across multiple elements of criminal procedure and beyond, in order to illuminate the cascading effects and consequent injustices suffered by Black defendants due to the continued automation and encoding of societal biases into the criminal justice system.