Criminal Justice in the Age of AI: Addressing Bias in Predictive Algorithms Used by Courts
The Ethics Gap in the Engineering of the Future
ISBN: 978-1-83797-636-2, eISBN: 978-1-83797-635-5
Publication date: 25 November 2024
Abstract
As artificial intelligence and machine learning become increasingly integrated into daily life, individuals and institutions alike are growing dependent on these technologies. It is crucial, however, to acknowledge that such advances can introduce flaws or vulnerabilities. A case in point is the investigation by the non-profit organization ProPublica into COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), a risk assessment tool widely used by US courts to estimate the likelihood that a defendant will reoffend. To address underlying biases, including racial biases, which can lead to inaccurate predictions and significant social harm, we review the current literature on algorithmic bias in decision systems, as well as the evolving treatment of fairness and accountability in machine learning. Within the realm of predictive policing algorithms employed in the criminal justice system, we focus on recent studies aimed at mitigating bias in algorithmic decision-making, specifically by reassessing recidivism rates and applying adversarial debiasing in conjunction with fairness metrics.
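As an illustration of the fairness metrics mentioned above, the following minimal sketch (with entirely synthetic data, not drawn from COMPAS) computes two measures commonly used to audit risk-assessment tools: the statistical parity difference (gap in "high risk" prediction rates between two demographic groups) and the false-positive-rate gap (the disparity ProPublica's analysis highlighted, where defendants who did not reoffend were labeled high risk at different rates across groups).

```python
# Illustrative sketch only: two standard group-fairness metrics.
# All data below is synthetic; function names are our own.

def statistical_parity_difference(preds, groups, group_a, group_b):
    """Gap in positive-prediction ("high risk") rates between two groups."""
    def rate(g):
        members = [p for p, grp in zip(preds, groups) if grp == g]
        return sum(members) / len(members)
    return rate(group_a) - rate(group_b)

def false_positive_rate_gap(labels, preds, groups, group_a, group_b):
    """Gap in false-positive rates: predicted high risk (1) but did not reoffend (label 0)."""
    def fpr(g):
        negatives = [p for l, p, grp in zip(labels, preds, groups)
                     if grp == g and l == 0]
        return sum(negatives) / len(negatives)
    return fpr(group_a) - fpr(group_b)

# Synthetic cohort: 1 = predicted high risk / actually reoffended
labels = [0, 0, 1, 0, 1, 0, 0, 1]
preds  = [1, 0, 1, 1, 0, 0, 1, 1]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(statistical_parity_difference(preds, groups, "A", "B"))   # 0.25
print(false_positive_rate_gap(labels, preds, groups, "A", "B"))
```

A perfectly fair classifier under these criteria would yield gaps of zero; debiasing methods such as adversarial debiasing are trained to drive these disparities down while preserving predictive accuracy.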
Citation
Karthikeyan, R., Yi, C. and Boudourides, M. (2024), "Criminal Justice in the Age of AI: Addressing Bias in Predictive Algorithms Used by Courts", Stelios, S. and Theologou, K. (Ed.) The Ethics Gap in the Engineering of the Future, Emerald Publishing Limited, Leeds, pp. 27-50. https://doi.org/10.1108/978-1-83797-635-520241003
Publisher: Emerald Publishing Limited
Copyright © 2025 Rahulrajan Karthikeyan, Chieh Yi and Moses Boudourides. Published under exclusive licence by Emerald Publishing Limited