Note from Esprii: This article focuses mainly on UK systems but includes examples from other jurisdictions as well. It particularly notes the difference between “solely automated systems [that] have no human involvement” (8) and “partially automated systems [that] assist or inform a human decision-maker” (8). I plan to focus on the US case studies within this article.
“Before the Wisconsin Supreme Court, the appellant in Loomis challenged the trial court’s dismissal of his post-trial motion for relief, which argued that his due process rights had been violated by the court’s reliance on Correctional Offender Management Profiling for Alternative Sanctions (COMPAS). COMPAS uses a predictive algorithmic system to produce risk ratings or “scores” indicating the likelihood of an individual committing another crime. Those scores are, amongst other information, provided to sentencing courts to provide background on the offender to inform sentencing decisions.” (17)
- Explains in more depth what COMPAS (one of the few algorithms approved for use in the criminal justice system) is and how it can affect court cases, particularly the power it holds in conjunction with the court.
- This connects to another point raised in the Transparency section (page 135) of Fairness and Accountability: if the judges, lawyers, or officials involved do not know how to use the system at hand properly, it can have dire consequences for the people they are trying, defending, etc.
- Quote I was referring to: “some algorithmic devices in use today in criminal justice contexts do not rely on machine-learning techniques. For instance, the risk-assessment tool developed by the Arnold Foundation is based on relatively straightforward regression models. Hence, even if most people, including most judges, lawyers and accused persons, do not understand how regressions work, that is no objection to relying on a risk-assessment device of this kind.”
- Transparency is important: “The appellant argued that as a result of the proprietary nature of COMPAS and lack of access to the methodology, he and his legal team were unable to assess the scientific validity of the COMPAS assessment. It was argued that this infringed the appellant’s right to an individualised sentence and the right to be sentenced on accurate information.”
“The Supreme Court agreed with the trial court that the use of COMPAS in the appellant’s sentencing did not violate due process rights. Sentences could still be sufficiently individualised because judges retained discretion and had sufficient information to disagree with the COMPAS assessment, when appropriate. In the circumstances, its use was not determinative in the appellant’s sentencing as the consideration of it was supported by other independent factors” (17)
- This holding is weakened by my earlier point and backup argument: many judges in the criminal justice system lack the knowledge of these systems needed to accurately decide when to agree or disagree with them.
Note from Esprii: While the following is not all about the US case, some of this information might be useful in the conclusion; pasted here:
“45. This survey of cases demonstrates that a variety of arguments have been deployed in challenges against the use of automated decision-making by public bodies in a number of jurisdictions. Key areas of challenge relate to privacy rights, data protection rights and discrimination, in addition to raising issues of irrationality and vires. The types of systems that have been the subject of challenge range from automated correspondence to complex risk prediction systems, the inner workings of which are often unknown due to lack of access to the underlying technology. The resulting judgments from these cases suggest that there is a substantial variation in the degree to which the courts have engaged with the underlying systems that are the subject of these challenges.
46. As is evident from the proportion of the above cases in which an appeal is pending or judgment is awaited, the judicial treatment of challenges in this area seems likely to continue to evolve at a rapid pace, both domestically and in other jurisdictions.” (20)
- The article surveys many other cases; reread it if resources run dry.
Hall, Claire. “Challenging Automated Decision-making by Public Bodies: Selected Case Studies from Other Jurisdictions.” Routledge Taylor & Francis Group, vol. 25, no. 1, 2020, pp. 8–20.