Thursday 28 June 2018

Guilty Until Proven Innocent: The Crisis in Our Justice System

As mentioned in my previous posting, I was invited by Jon Robins (the Justice Gap) to speak at the third meeting of the All-Party Parliamentary Group on Miscarriages of Justice, hosted by Barry Sheerman MP, in the House of Commons on 25 June 2018. The meeting was based around the launch of Jon Robins' outstanding new book "Guilty Until Proven Innocent: The Crisis in Our Justice System". Other speakers were: Michael Mansfield QC and lawyer Matt Foot, who have been involved in many of the cases described in the book; Waney Squier, the world-renowned neuropathologist who suffered for being one of the few medical experts to question the mainstream medical guidelines on 'shaken baby syndrome'; Gloria Morrison, who spoke about the problems of Joint Enterprise relevant to some of the cases; and Liam Allan and Eddie Gilfoyle, who spoke about their own experiences (theirs are two of the cases discussed in the book). It was a very powerful and informative meeting which was very well attended (with many having to stand for the full two hours).

I have now written a detailed review of the book (also available here), which includes more about the House of Commons meeting.

See also

Monday 25 June 2018

On the Role of Statistics in Miscarriages of Justice

I have been invited by Jon Robins (the Justice Gap) to speak today at the third meeting of the All-Party Parliamentary Group on Miscarriages of Justice, hosted by Barry Sheerman MP, in the House of Commons. Jon Robins will be talking about his outstanding new book "Guilty Until Proven Innocent: The Crisis in Our Justice System" at the event. The book includes a description of the Ben Geen case for which I provided a report to the Criminal Cases Review Commission in 2015 showing that the sequence of 'unusual events' at the Horton General Hospital (where Ben Geen worked as a nurse) was not especially unusual.
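The statistical point at the heart of cases like Ben Geen's is that, across many hospitals and many nurses, an apparently 'suspicious' cluster of adverse events at one hospital is quite likely to occur by chance alone. The following sketch illustrates this with a simple Monte Carlo simulation; all the numbers (150 hospitals, 200 shifts, a 5% chance of an event per shift, a cluster threshold of 18) are assumptions chosen purely for illustration and are not the figures from the actual CCRC report.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumptions (NOT the figures from the actual CCRC report):
# 150 hospitals; on each of a nurse's 200 shifts there is a 5% chance of
# an adverse event occurring by chance alone.
N_HOSPITALS, N_SHIFTS, P_EVENT = 150, 200, 0.05
CLUSTER = 18      # an event count that would look 'suspicious' at one hospital
N_SIMS = 10_000   # number of simulated 'worlds'

# For each simulated world, count the chance events on one nurse's shifts
# at every hospital.
counts = rng.binomial(N_SHIFTS, P_EVENT, size=(N_SIMS, N_HOSPITALS))

# How often does at least ONE of the hospitals show such a cluster?
prob = (counts >= CLUSTER).any(axis=1).mean()
print(f"P(some hospital shows a 'suspicious' cluster by chance) = {prob:.2f}")
```

The per-hospital probability of such a cluster is small, but with these assumed numbers the probability that *somewhere* among the 150 hospitals a nurse's shifts coincide with a cluster is well over a half. Looking only at the hospital where the cluster happened, after the fact, makes a chance pattern look like evidence.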

My short talk today focuses on the role of statistics in miscarriages of justice. A transcript of the talk can be found here.

Norman Fenton

See also

Friday 22 June 2018

Bias in AI Algorithms

This is an update of a posting originally made on 18 Jan 2018 (see below for the update).

On 17 Jan 2018 multiple news sources (e.g. see here, here, and here) ran a story about a new research paper that claims to expose both the inaccuracies and racial bias in COMPAS - one of the most common algorithms used for parole and sentencing decisions to predict recidivism (i.e. whether or not a defendant will re-offend).

The research paper was written by the world famous computer scientist Hany Farid (along with a student Julia Dressel).

But the real story here is that the paper’s accusation of racial bias (specifically that the algorithm is biased against black people) is based on a fundamental misunderstanding of causation and statistics. The algorithm is no more ‘biased’ against black people than it is biased against white single parents, old people, people living in Beattyville Kentucky, or women called ‘Amber’. In fact, as we show in this brief article, if you choose any factor that correlates with poverty you will inevitably replicate the statistical ‘bias’ claimed in the paper. And if you accept the validity of the claims in the paper then you must also accept, for example, that a charity which uses poverty as a factor to identify and help homeless people is being racist because it is biased against white people (and also, interestingly, Indian Americans).
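The point can be demonstrated with a few lines of simulation. In the hypothetical population below, re-offending depends *only* on a single 'deprivation' score - race plays no causal role and the risk rule is identical for everyone - yet simply because one group is (by assumption) poorer on average, that group ends up with a higher false positive rate. All the numbers and the logistic risk rule are assumptions for illustration, not taken from the COMPAS data or from our article.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, mean_deprivation):
    """Hypothetical group: re-offending depends ONLY on a deprivation score."""
    score = rng.normal(mean_deprivation, 1.0, n)
    # The SAME risk rule for everyone: risk rises with the score.
    reoffend = rng.random(n) < 1 / (1 + np.exp(-(score - 1.0)))
    return score, reoffend

# Group B is poorer on average - by assumption, the ONLY difference.
score_a, y_a = make_group(100_000, mean_deprivation=0.0)
score_b, y_b = make_group(100_000, mean_deprivation=1.0)

def false_positive_rate(score, y, threshold=1.0):
    """Fraction of non-re-offenders flagged as 'high risk'."""
    flagged = score > threshold
    return (flagged & ~y).sum() / (~y).sum()

print(f"False positive rate, group A: {false_positive_rate(score_a, y_a):.2f}")
print(f"False positive rate, group B: {false_positive_rate(score_b, y_b):.2f}")
```

Group B's false positive rate comes out substantially higher even though the 'algorithm' never sees group membership and applies one rule to all. This is exactly the statistical pattern the paper presents as evidence of racial bias: it arises automatically whenever base rates differ between groups.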

What is most important here is that the paper was published at all, and that none of the media outlets running the story realised they were pushing fake news. Depressingly, many similar research studies involving the same kind of misinterpretation of statistics result in popular media articles that push a false narrative of one kind or another.

22 June 2018 Update: It turns out that Microsoft is now "developing a tool to help engineers catch bias in algorithms". The article also cites the case of the COMPAS software:
 "...., which uses machine learning to predict whether a defendant will commit future crimes, was found to judge black defendants more harshly than white defendants." 
Interestingly, this latest news article about Microsoft does NOT refer to the 2018 Dressel and Farid article but, rather, to an earlier 2016 article by Larson et al. From a quick inspection it does seem to be a more comprehensive study than the flawed Dressel and Farid article, but my quick impression is that the same fundamental misunderstandings of statistics/causality are there. Given the great degree of interest in AI/bias, and given also that we were unaware of the 2016 study, we plan to do an update to our unpublished paper.

Our article (5 pages): Fenton, N.E., & Neil, M. (2018). "Criminally Incompetent Academic Misinterpretation of Criminal Data - and how the Media Pushed the Fake News". Also available here.

The research paper: Dressel, J. & Farid, H. The accuracy, fairness, and limits of predicting recidivism. Sci. Adv. 4, eaao5580 (2018). 

Thanks to Scott McLachlan for the tip off on this story.

See some previous articles on poor use of statistics:

Wednesday 20 June 2018

New project: Bayesian Artificial Intelligence for Decision Making under Uncertainty

Anthony Constantinou - a lecturer based in the Risk and Information Management Group at Queen Mary University of London - has been awarded a prestigious 3-year EPSRC Fellowship Grant of £475,818, in partnership with Agena Ltd, to develop open-source software that will enable end-users to quickly and efficiently generate Bayesian Decision Networks (BDNs) for optimal real-world decision-making. BDNs are Bayesian Networks augmented with additional functionality and knowledge-based assumptions to represent decisions and associated utilities that a decision maker would like to optimise. BDNs are suitable for modelling real-world situations where we seek to discover the optimal decision path to maximise utilities of interest and minimise undesirable risk.
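To give a flavour of the kind of computation such networks automate: for each available decision, you combine the probability of each outcome with the utility of that (decision, outcome) pair, and pick the decision with the highest expected utility. The toy sketch below (nothing to do with the project's actual software; the decisions, probabilities, and utilities are made-up numbers) shows this for a single decision node with two options.

```python
# Toy expected-utility calculation for one decision node.
# All numbers are assumptions chosen purely for illustration.

# P(success | decision): the riskier option succeeds less often.
p_success = {"invest": 0.6, "hold": 0.9}

# Utility of each (decision, outcome) pair.
utility = {
    ("invest", True): 100, ("invest", False): -50,
    ("hold", True): 20,    ("hold", False): 0,
}

def expected_utility(decision):
    """Probability-weighted average utility of a decision."""
    p = p_success[decision]
    return p * utility[(decision, True)] + (1 - p) * utility[(decision, False)]

best = max(p_success, key=expected_utility)
print(best, expected_utility(best))  # -> invest 40.0
```

Here "invest" wins (expected utility 40 versus 18) despite its lower success probability, because the utilities reward the upside heavily. In a full BDN the probabilities themselves would come from a Bayesian Network conditioned on the available evidence, rather than being fixed numbers as in this sketch.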

A full description of the project can be found here. The EPSRC announcement is here.