Tuesday 18 October 2016

A2 Crime & Deviance - COMPAS: Correctional Offender Management Profiling for Alternative Sanctions

Prisoner Eric Loomis challenged the risk assessment used in determining his sentence

In the USA, in an attempt to remove human bias from decision-making procedures such as granting bail, setting sentence length and granting parole, a 'machine selection' procedure called COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) was introduced, whereby these decisions are informed by the results of an algorithm.

Offenders are given a questionnaire to complete, and their answers are supplied to a private company called Northpointe Inc., which runs them through a proprietary computer algorithm to produce a 'risk of future offending' score from one to ten. The higher the score, the more likely the offender is judged to be an "individual who is at high risk to the community", and so the stronger the case for a longer sentence, or for refusing bail or parole.
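To make this concrete, here is a minimal sketch in Python of how a questionnaire-based risk scorer of this general kind might work. COMPAS's actual questions, weights and scoring method are all proprietary and secret, so every factor name and number below is invented purely for illustration:

    # Purely illustrative: COMPAS's real questions, weights and scoring
    # method are proprietary and secret. Every name and number below is
    # invented for teaching purposes only.

    # Hypothetical questionnaire answers (counts, or 1 = yes / 0 = no)
    answers = {
        "prior_arrests": 3,      # number of prior arrests
        "age_under_25": 1,       # 1 if the offender is under 25
        "unstable_housing": 0,   # 1 if the offender has no fixed address
        "unemployed": 1,         # 1 if currently unemployed
    }

    # Hypothetical weights: how strongly each factor raises the raw score
    weights = {
        "prior_arrests": 1.5,
        "age_under_25": 2.0,
        "unstable_housing": 1.0,
        "unemployed": 1.0,
    }

    # A weighted sum of the answers gives a raw score, which is then
    # clamped onto the 1-10 'risk of future offending' scale.
    raw = sum(weights[factor] * value for factor, value in answers.items())
    risk_score = max(1, min(10, round(raw)))

    print(f"Risk score: {risk_score}/10")  # with these numbers: 8/10

The point of the sketch is simply that a handful of weighted answers can be collapsed into a single number with serious consequences, which is exactly why the secrecy of the real questions and weights matters.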

The algorithm itself is kept secret, however, not only from offenders but even from law enforcement and criminal justice personnel, so nobody outside Northpointe Inc. can be sure just how the risk scores are created.

This has led one convicted offender to challenge his sentencing. Eric Loomis argued that he had no way to verify or challenge the accuracy of the score on which his sentence was partly based, because the algorithm behind it is secret. You can read more about his case here: http://www.bbc.co.uk/news/magazine-37658374

You can read what ProPublica concluded when they analysed the data here: https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm

You can read about ProPublica's conclusion that the supposedly neutral algorithm is racially biased here: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

Find out what EPIC (the Electronic Privacy Information Center) had to say more generally about algorithmic bias here: https://epic.org/algorithmic-transparency/crim-justice/

This is what the Wisconsin Department of Corrections, which dealt with Eric Loomis's case, has to say about the use of COMPAS: http://doc.wi.gov/about/doc-overview/office-of-the-secretary/office-of-reentry/compas-assessment-tool

Danielle Citron, a writer on privacy, civil rights and automated systems, comments in Forbes magazine here: http://www.forbes.com/sites/daniellecitron/2016/07/13/unfairness-of-risk-scores-in-criminal-sentencing/#5f21ff984479

You can also read what Northpointe Inc. had to say about ProPublica's analysis of their COMPAS system here: http://www.northpointeinc.com/northpointe-analysis

And finally, you can read what an independent group of academics has to say about this issue in their draft paper, "False Positives, False Negatives, and False Analyses: A Rejoinder to 'Machine Bias: There's Software Used Across the Country to Predict Future Criminals. And it's Biased Against Blacks.'", here: http://www.crj.org/page/-/publications/rejoinder7.11.pdf
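To see what the argument about 'false positives' and 'false negatives' actually involves, here is a short illustrative calculation in Python. A false positive is someone labelled high risk who does not go on to reoffend; a false negative is someone labelled low risk who does. The numbers below are invented for illustration and are not ProPublica's or Northpointe's figures:

    # Invented numbers for illustration only; NOT ProPublica's data.
    # Each record is (risk label given by the tool, whether the person reoffended).
    groups = {
        "group_A": [("high", True)] * 40 + [("high", False)] * 20
                 + [("low", True)] * 10 + [("low", False)] * 30,
        "group_B": [("high", True)] * 20 + [("high", False)] * 10
                 + [("low", True)] * 20 + [("low", False)] * 50,
    }

    for name, records in groups.items():
        # False positive rate: of those who did NOT reoffend,
        # what fraction were wrongly labelled high risk?
        non_reoffenders = [label for label, reoffended in records if not reoffended]
        fpr = non_reoffenders.count("high") / len(non_reoffenders)

        # False negative rate: of those who DID reoffend,
        # what fraction were wrongly labelled low risk?
        reoffenders = [label for label, reoffended in records if reoffended]
        fnr = reoffenders.count("low") / len(reoffenders)

        print(f"{name}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")

In this made-up example a 'high risk' label means exactly the same thing in both groups (two-thirds of those labelled high risk go on to reoffend), which is the sense in which Northpointe calls the tool fair; yet non-reoffenders in group_A are wrongly labelled high risk more than twice as often as in group_B (40% versus about 17%), which is the sense in which ProPublica calls it biased. Because the two groups reoffend at different underlying rates, both claims can be true at once.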


You can see an example of an actual COMPAS questionnaire here: https://www.documentcloud.org/documents/2702103-Sample-Risk-Assessment-COMPAS-CORE.html

