Friday, 19 January 2018

Criminally Incompetent Academic Misinterpretation of Criminal Data - and how the Media Pushed the Fake News


On 17 Jan 2018 multiple news sources (e.g. see here, here, and here) ran a story about a new research paper that claims to expose both the inaccuracies and racial bias in one of the most common algorithms used by parole boards to predict recidivism (i.e. whether or not a defendant will re-offend).

The research paper was written by the world-famous computer scientist Hany Farid (along with a student, Julia Dressel).

But the real story here is that the paper’s accusation of racial bias (specifically that the algorithm is biased against black people) is based on a fundamental misunderstanding of causation and statistics. The algorithm is no more ‘biased’ against black people than it is biased against white single parents, old people, people living in Beattyville, Kentucky, or women called ‘Amber’. In fact, as we show in this brief article, if you choose any factor that correlates with poverty you will inevitably replicate the statistical ‘bias’ claimed in the paper. And if you accept the validity of the claims in the paper, then you must also accept, for example, that a charity which uses poverty as a factor to identify and help homeless people is being racist because it is biased against white people (and also, interestingly, Indian Americans).
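To see the underlying statistical point, here is a minimal simulation of our own (it is not the method of either paper, and the group labels, base rates and threshold are invented purely for illustration). If two groups differ only in their underlying rate of reoffending - which is exactly what happens when any poverty-related factor differs between them - then even a perfectly calibrated risk score, applied identically to every individual, will label a larger share of the non-reoffenders in the higher-rate group as 'high risk':

```python
# Minimal illustrative sketch (ours, not from either paper): the same perfectly
# calibrated risk score yields a higher false positive rate, and a lower false
# negative rate, in whichever group has the higher reoffending rate.
# All numbers below are invented for illustration only.
import numpy as np

rng = np.random.default_rng(2018)
N = 200_000

def simulate(mean_propensity):
    """Simulate one group whose latent reoffending propensity has the given mean."""
    # A Beta distribution keeps each person's propensity in [0, 1];
    # the only difference between the groups is its mean.
    p = rng.beta(10 * mean_propensity, 10 * (1 - mean_propensity), N)
    reoffends = rng.random(N) < p         # true outcome
    high_risk = p > 0.5                   # identical decision rule for everyone
    fpr = high_risk[~reoffends].mean()    # non-reoffenders labelled high risk
    fnr = (~high_risk)[reoffends].mean()  # reoffenders labelled low risk
    return fpr, fnr

for label, mean_p in [("lower-rate group ", 0.35), ("higher-rate group", 0.50)]:
    fpr, fnr = simulate(mean_p)
    print(f"{label}: false positive rate = {fpr:.2f}, false negative rate = {fnr:.2f}")
```

The higher-rate group ends up with a substantially higher false positive rate (and a lower false negative rate) even though the score treats every individual identically - which is precisely the pattern the paper presents as evidence of racial bias.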

What matters most here is that the paper was published at all, and that none of the media outlets running the story realised they were pushing fake news. Depressingly, many similar research studies involving the same kind of misinterpretation of statistics result in popular media articles that push a false narrative of one kind or another.

Our article (5 pages): Fenton, N.E., & Neil, M. (2018). "Criminally Incompetent Academic Misinterpretation of Criminal Data - and how the Media Pushed the Fake News". http://dx.doi.org/10.13140/RG.2.2.32052.55680. Also available here.

The research paper: Dressel, J., & Farid, H. (2018). "The accuracy, fairness, and limits of predicting recidivism". Science Advances, 4, eaao5580.


See some previous articles on poor use of statistics:

Thursday, 11 January 2018

On lawnmowers and terrorists again: the danger of using historical data alone for decision-making

The short paper and blog posting we published last week generated a lot of interest, especially after Nassim Nicholas Taleb retweeted it. An edited version (along with a response from a representative of the Royal Statistical Society) will appear in the February issue of Significance magazine (the magazine of the RSS and the American Statistical Association). In the meantime we have produced another short paper that explores further problems with the 'lawnmower versus terrorist risk' statistics - in particular, the inevitable limitations and dangers of relying on historical data alone for risk assessment:
Fenton, N.E., & Neil, M. (2018). "Is decision-making using historical data alone more dangerous than lawnmowers?", Open Access Report DOI:10.13140/RG.2.2.20914.71363. Also available here.

Wednesday, 3 January 2018

Are lawnmowers a greater risk than terrorists?

Kim Kardashian, whose tweet comparing the threats of lawnmowers and terrorists led to RSS acclaim
In December 2017 the Royal Statistical Society (RSS) announced the winner of its “International Statistic of the Year”. The statistic was simply "69", which it said was "the annual number of Americans killed, on average, by lawnmowers - compared to two Americans killed annually, on average, by immigrant Jihadist terrorists". The full RSS citation says that the statistic, tweeted by Kim Kardashian, ‘highlights misunderstandings of risk’ and ‘illuminates the bigger picture’. Unfortunately, we believe it does exactly the opposite, as we explain in this brief paper:
Fenton, N.E., & Neil, M. (2018). "Are lawnmowers a greater risk than terrorists?" Open Access Report DOI:10.13140/RG.2.2.34461.00486/1 
As you can see from the tweet by Taleb, this use of statistics for risk assessment was not universally welcomed.
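As a back-of-the-envelope sketch of our own (the annual counts below are illustrative assumptions, not data from our paper or from the RSS), an 'average annual deaths' figure for a risk dominated by rare, extreme events depends almost entirely on whether the chosen window happens to include the exceptional year - whereas a steady risk gives much the same average over any window:

```python
# Back-of-the-envelope sketch (illustrative numbers only): the 'average annual
# deaths' for a risk dominated by rare, extreme events depends almost entirely
# on whether the chosen window includes the exceptional year.
from statistics import mean

# Hypothetical annual death counts: near zero in most years, thousands in one.
deaths_by_year = [0, 0, 0, 0, 0, 3000, 0, 5, 0, 0, 0, 10, 0, 0, 0]

window_with_extreme = deaths_by_year         # 15-year window including the spike
window_without_extreme = deaths_by_year[6:]  # 9-year window excluding it

print("Average per year including the extreme year:", round(mean(window_with_extreme), 1))
print("Average per year excluding the extreme year:", round(mean(window_without_extreme), 1))

# A steady risk (of the order of 69 deaths every single year) gives the same
# average over either window, so comparing the two averages says very little
# about the relative risks going forward.
```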


See an update to this story here.