Volume 59, Issue 1 (2022)

The Solution to the Pervasive Bias and Discrimination in the Criminal Justice System: Transparent and Fair Artificial Intelligence

by Mirko Bagaric, Jennifer Svilar, Melissa Bull, Dan Hunter, & Nigel Stobbs

Algorithms are increasingly used in the criminal justice system for a range of important matters, including determining the sentences imposed on offenders, deciding whether offenders should be released early from prison, and selecting the locations where police should patrol. The use of algorithms in this domain has been severely criticized on a number of grounds, including that they are inaccurate and discriminate against minority groups. Yet algorithms are widely used in many other social endeavors, including flying planes and assessing eligibility for loans and insurance. In fact, most people regularly use algorithms in their day-to-day lives: Google Maps relies on algorithms, as do Siri, weather forecasting, and autopilot systems. The criminal justice system is one of the few areas of human activity that has not substantially embraced the use of algorithms. This Article explains why the criticisms leveled against the use of algorithms in the criminal justice domain are flawed. The manner in which algorithms operate is generally misunderstood. Algorithms are not autonomous machine applications or processes. Instead, they are developed and programmed by people, and their efficacy is determined by the quality of the design process. Intelligently designed algorithms can replicate human cognitive processing, but they have a number of advantages, including the speed at which they process information. And because they do not have feelings, they are more objective and predictable than people in their decision-making. They are a core component of overcoming the pervasive bias and discrimination that exist in the criminal justice system.
