Over the past few years, criminal justice algorithms, also known as “algorithmic risk assessments”, have been taken into account by courts with increasing frequency in order to estimate an offender’s likelihood of recidivism. Given that the criteria applied are beyond the offender’s control (such as his or her socioeconomic status, family background, gender, or neighbourhood crime rate) and the methodology more often than not remains obscure, their integration into the criminal sentencing procedure is widely debated, raising issues of fairness, accountability and transparency.[1]
This matter shall be treated through the presentation of a leading case, State v Loomis,[2] decided by the Wisconsin Supreme Court on July 13th, 2016: the first decision to address at such length the constitutionality of using risk-assessment algorithms at sentencing, and only the second to raise the subject at all.[3]
Firstly, I will provide a brief summary of the case; then I will examine some key arguments voiced by the court; finally, I will offer a concluding commentary on the possible future impact of the case.
I. The case
In early 2013, Wisconsin charged Eric Loomis with five criminal counts related to a drive-by shooting in La Crosse. Loomis denied participating in the shooting, but admitted to later driving the same car involved. He pleaded guilty to two lesser charges: “attempting to flee a traffic officer and operating a motor vehicle without the owner’s consent.” The trial court sentenced Loomis to six years of imprisonment and five years of extended supervision, relying in part on a COMPAS risk assessment[4], proprietary software of a private company (Northpointe, Inc.) which classified Loomis as presenting a high risk of recidivism in comparison to similar population groups. The assessment drew on an interview and his personal criminal history, while its methodology remained a trade secret. Loomis subsequently filed a post-conviction motion alleging a violation of his due process rights[5], namely his right to a fair trial and his right to an individualised sentence, arguing that the software’s proprietary nature prevented him from challenging its scientific accuracy and that it unconstitutionally took gender into consideration. The Wisconsin Court of Appeals certified the appeal to the Wisconsin Supreme Court.
II. The court’s reasoning and the progress of the case
Writing for the Court, Justice Ann Walsh Bradley highlighted the need to adapt to evolving technological tools: “As data changes, our use of evidence-based tools will have to change as well. The justice system must keep up with the research and continuously assess the use of these tools.”[6]
In response to the accuracy argument, the Supreme Court acknowledged that the proprietary nature of COMPAS prevented Loomis from seeing exactly how his score was calculated. However, since most of the information the algorithm used came from a questionnaire that he himself completed and from public records, the Court concluded that he did have an opportunity to verify the accuracy of that information. Furthermore, it suggested that a due process challenge might succeed if the risk assessment score were the determinative or sole factor considered.
As to the gender claim, the Court held that Loomis had not met the burden of proving that the consideration of his gender was actually decisive for the sentence imposed, noting that such consideration serves statistical accuracy: “Regardless of whether gender is used as a criminogenic factor or solely for statistical norming, Loomis objects to any use of gender in calculating COMPAS's risk scores. In response, the State contends that considering gender in a COMPAS risk assessment is necessary to achieve statistical accuracy. The State argues that because men and women have different rates of recidivism and different rehabilitation potential, a gender neutral risk assessment would provide inaccurate results for both men and women. […] Instead, it appears that any risk assessment tool which fails to differentiate between men and women will misclassify both genders.”[7]
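To make the “statistical norming” argument concrete, the minimal sketch below uses purely hypothetical base rates and group sizes (the figures are illustrative assumptions, not Northpointe’s actual model or data) to show why a single pooled prediction, applied to groups with different recidivism rates, under-predicts for one group and over-predicts for the other:

```python
# Minimal sketch of the statistical-norming argument.
# All numbers are hypothetical; this is not Northpointe's model.

BASE_RATE = {"men": 0.40, "women": 0.25}  # assumed recidivism rates
SAMPLE    = {"men": 700,  "women": 300}   # assumed group sizes

# A "gender-neutral" tool effectively applies the pooled rate to everyone.
pooled = sum(BASE_RATE[g] * SAMPLE[g] for g in SAMPLE) / sum(SAMPLE.values())
print(f"pooled prediction applied to all: {pooled:.3f}")  # 0.355

for g in SAMPLE:
    gap = pooled - BASE_RATE[g]
    direction = "over-predicts" if gap > 0 else "under-predicts"
    print(f"{g}: true rate {BASE_RATE[g]:.2f} -> pooled model {direction} by {abs(gap):.3f}")
```

On these assumed figures the pooled model understates the men’s rate by 0.045 and overstates the women’s by 0.105: the misclassification “of both genders” that the State described.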
The Court additionally gave a general overview of the acceptable uses of COMPAS assessments (diverting low-risk prison-bound offenders to a non-prison alternative; assessing whether an offender can be supervised safely and effectively in the community; and imposing terms and conditions of probation, supervision, and responses to violations)[8] and highlighted the caution judges should exercise when relying on risk assessments, mandating that presentence investigation (PSI) reports containing COMPAS risk assessments make certain disclosures to sentencing courts. This “written advisement of its limitations” should explain that:
- COMPAS is a proprietary tool, which has prevented the disclosure of specific information about the weights of the factors or how risk scores are calculated;
- COMPAS scores are based on group data, and therefore identify groups with characteristics that make them high-risk offenders, not particular high-risk individuals;
- Several studies have suggested that the COMPAS algorithm may be biased in how it classifies minority offenders;
- COMPAS compares defendants to a national sample, but has not completed a cross-validation study for a Wisconsin population, and tools like this must be constantly monitored and updated for accuracy as populations change; and
- COMPAS was not originally developed for use at sentencing; rather, it was designed to inform post-sentencing determinations.
These five warnings were intended to instill a certain judicial skepticism with regard to the accuracy of the tool.[9] Nevertheless, the ruling seems vague, as it does not indicate exactly how judges should weigh them. The problem is all the weightier considering that, in most cases, individuals tend to defer to and trust algorithmic output.[10]
Justice Shirley Abrahamson concurred, but criticised the Court’s decision to deny Northpointe, the company that developed the program, the opportunity to file an amicus brief. She remarked that “in considering COMPAS (or other risk assessment tools) in sentencing, a circuit court must set forth on the record a meaningful process of reasoning addressing the relevance, strengths, and weaknesses of the risk assessment tool.” She pointedly observed that “[such] denial was a mistake. The court needed all the help it could get.”[11]
It is noteworthy in this respect that the tool used in Loomis’ sentencing appeared to disadvantage offenders of colour: black defendants were 45% more likely than white defendants to receive a higher risk score.[12]
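The “45% more likely” figure is an odds ratio. As a purely illustrative sketch (the counts below are hypothetical, chosen to yield an odds ratio of about 1.45; ProPublica’s published figure came from a logistic regression controlling for criminal history, recidivism, age and gender, which this toy calculation omits), the arithmetic behind such a claim looks like this:

```python
# Hypothetical 2x2 table: how an odds ratio becomes "45% more likely".
# Counts are invented for illustration; ProPublica used a logistic
# regression with controls, not a raw cross-tabulation.

higher_score = {"black": 266, "white": 200}  # assumed counts scored medium/high
lower_score  = {"black": 734, "white": 800}  # assumed counts scored low

odds = {g: higher_score[g] / lower_score[g] for g in higher_score}
odds_ratio = odds["black"] / odds["white"]
print(f"odds ratio: {odds_ratio:.2f}")  # ~1.45, i.e. 45% higher odds of a higher score
```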
On June 26th, 2017, the US Supreme Court denied Loomis’ petition for a writ of certiorari, leaving the Wisconsin Supreme Court’s ruling in place.[13]
III. A commentary
The real value and the correct use of algorithmic risk assessments remain open to debate. On the one hand, validation studies report only moderate accuracy: there is roughly a 70% chance that a randomly selected recidivist will be assigned a higher risk score than a randomly selected non-recidivist.[14] On the other hand, it is generally accepted that risk assessment instruments can predict who is likely to recidivate with at least some degree of accuracy, even within societies marked by a high degree of diversity, like that of the United States.[15]
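The 70% figure is the standard interpretation of an AUC of 0.70. The sketch below (with invented decile scores; actual validation samples are far larger) computes that statistic directly, as the share of recidivist/non-recidivist pairs in which the recidivist receives the higher score:

```python
# What an AUC of 0.70 measures: pick one recidivist and one
# non-recidivist at random; how often does the recidivist score higher?
# Scores below are invented decile scores for illustration only.
import itertools

recidivist_scores     = [6, 3, 8, 5, 2]
non_recidivist_scores = [4, 6, 1, 2]

wins = ties = 0
for r, n in itertools.product(recidivist_scores, non_recidivist_scores):
    if r > n:
        wins += 1
    elif r == n:
        ties += 1

pairs = len(recidivist_scores) * len(non_recidivist_scores)
auc = (wins + 0.5 * ties) / pairs  # ties count half, per the usual definition
print(f"AUC = {auc:.2f}")  # 0.70 on these toy numbers
```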
In any case, a judgment based solely on statistical data would be hard to deem properly founded. The US Supreme Court has long been reluctant to rest decisions on such studies, a stance vividly conveyed in the following remark: “[…] proving broad sociological propositions by statistics . . . inevitably is in tension with the normative philosophy that underlies the Equal Protection Clause”.[16]
Consequently, it seems that a decision can only be found at fault if it is essentially based on an algorithmic risk assessment; when such a tool is merely listed among other considerations and arguments, it is more likely to impede than to encourage arbitrary rulings. While a judge’s intuition, instinct and sense of justice are, and have always been, invaluable to the judiciary, it can be useful in certain cases to consider objective and solidly founded data to enhance the individualisation of a sentence, at least insofar as past events are concerned. A related point is crucial: only risk assessments that disclose and properly explain the methodology applied should be used in support of a judicial argument. Although these assessments have a long way to go before they can be fully integrated into sentencing decisions, if methodological transparency is established, so that the defendant is able to question their results, they could occasionally, and within socially diverse societies, be of true assistance to sentencing courts.
NOTES
[1] Danielle Kehl, Priscilla Guo & Samuel Kessler, Algorithms in the Criminal Justice System: Assessing the Use of Risk Assessments in Sentencing, Responsive Communities (July 2017). Available at: https://cyber.harvard.edu/publications/2017/07/Algorithms.
[2] State v Loomis, No. 2015AP157-CR, 881 N.W.2d 749 (Wis. 2016). Available at: https://www.wicourts.gov/sc/opinion/DisplayDocument.pdf?content=pdf&seqNo=171690
[3] See Malenchik v State, 928 N.E.2d 564 (Ind. 2010).
[4] COMPAS stands for “Correctional Offender Management Profiling for Alternative Sanctions”, a tool broadly used in Wisconsin and elsewhere. Tim Brennan et al., Northpointe Inst. for Pub. Mgmt. Inc., Evaluating the Predictive Validity of the COMPAS Risk and Needs Assessment System, 36 CRIM. JUST. & BEHAV. 21, 21 (2009). A sample can be found at https://www.documentcloud.org/documents/2702103-Sample-Risk-Assessment-COMPAS-CORE.html.
[5] The Due Process Clause of the Fifth Amendment binds the federal government; the Fourteenth Amendment contains a parallel Due Process Clause that applies to the states. The clause protects against the deprivation of “life, liberty, or property, without due process of law.” See Katherine Freeman, Algorithmic Injustice: How the Wisconsin Supreme Court Failed to Protect Due Process Rights in State v Loomis, North Carolina Journal of Law and Technology, Vol. 18, December 2016.
[6] 881 N.W.2d 754 (Wis. 2016).
[7] 881 N.W.2d 766, 767 (Wis. 2016).
[8] 881 N.W.2d 768.
[9] For a firmer view of the value of such risk assessment tools, see Malenchik v State, 928 N.E.2d 564, 574-575 (Ind. 2010).
[10] Angèle Christin, Alex Rosenblat & danah boyd, Courts and Predictive Algorithms, Data & Civil Rights Primer (October 2015). Available at: http://www.datacivilrights.org/pubs/2015-1027/Courts_and_Predictive_Algorithms.pdf
[11] 881 N.W.2d 775.
[12] Cecelia Klingele, The Promises and Perils of Evidence-Based Corrections, 91 NOTRE DAME L. REV. 537, 577 (2015). See also Jeff Larson et al., How We Analyzed the COMPAS Recidivism Algorithm, PROPUBLICA (May 23, 2016), available at https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm.
[13] Loomis v Wisconsin, No. 16-6387, 137 S.Ct. 2290 (2017). See, however, Brief of the United States as Amicus Curiae: “[…] the Wisconsin Supreme Court correctly declined to find a due process violation. But that is not to say that the use of actuarial risk assessments at sentencing will always be constitutionally sound. Some uses of an undisclosed risk-assessment algorithm might raise due process concerns—if, for example, a defendant is denied access to the factual inputs about his criminal and personal history, or if his risk scores form part of a sentencing “matrix” or establish a “presumptive” term of imprisonment”.
[14] John Lightbourne, Damned Lies and Criminal Sentencing Using Evidence-Based Tools, Duke Law and Technology Review, Vol. 15, No. 1.
[15] Nathan James, Risk and Needs Assessment in the Criminal Justice System, Congressional Research Service, October 13, 2015.
[16] Craig v Boren, 429 U.S. 190, 204 (1976).