Wednesday 22 May 2019

Defining the dreaded 'prior probability of guilt' - a new paper that does just that


One of the greatest impediments to the use of probabilistic reasoning in legal arguments is the difficulty of agreeing on an appropriate prior probability that the defendant is guilty. The 'innocent until proven guilty' assumption technically means a prior probability of 0 - a figure that (by Bayesian reasoning) can never be overturned no matter how much evidence follows. Some have suggested the logical equivalent of 1/N, where N is the number of people in the world. But this probability is clearly too low, as N includes many people who could not physically have committed the crime. On the other hand, the often-suggested prior of 0.5 is too high, as it stacks the odds too heavily against the defendant.
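
To see why a prior of exactly 0 can never be overturned, substitute it into Bayes' theorem (writing G for 'guilty' and E for the evidence): the numerator is forced to 0 whatever the likelihoods, so the posterior is 0 too.

```latex
P(G \mid E) = \frac{P(E \mid G)\,P(G)}{P(E \mid G)\,P(G) + P(E \mid \neg G)\,P(\neg G)}
            = \frac{P(E \mid G) \cdot 0}{P(E \mid G) \cdot 0 + P(E \mid \neg G) \cdot 1} = 0
```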

Therefore, even strong supporters of a Bayesian approach seem to think they can and must ignore the need to consider a prior probability of guilt (indeed, it is this thinking that explains the prominence of the 'likelihood ratio' approach discussed so often on this blog).

This new paper, published online in the OUP journal Law, Probability and Risk (and extending a previous paper presented at the 2017 International Conference on Artificial Intelligence and Law), shows that, in a large class of cases, it is possible to arrive at a realistic prior that is also as consistent as possible with the legal notion of ‘innocent until proven guilty’. The approach is based on first identifying the ‘smallest’ time and location surrounding the actual crime scene within which the defendant was definitely present, and then estimating the number of people - other than the suspect - who were also within this time/area. If there were n people in total (including the suspect), then, before any other evidence is considered, each person has an equal prior probability of 1/n of having carried out the crime.
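
A minimal sketch of the calculation, with entirely hypothetical numbers (the paper itself addresses how such estimates are obtained and defended in practice):

```python
# Opportunity prior: suppose the smallest time/location window that
# definitely contains the defendant also contained 49 other people.
others = 49                # estimated people in the window, excluding the suspect
n = others + 1             # total number of possible perpetrators
prior_guilt = 1 / n        # each of the n people is equally likely a priori

print(f"Opportunity prior for the defendant: {prior_guilt:.3f}")   # 0.020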

The method applies to cases where we assume a crime has definitely taken place and that it was committed by one person against one other person (e.g. murder, assault, robbery). The paper considers both the practical and legal implications of the approach and demonstrates how the prior probability is naturally incorporated into a generic Bayesian network model that allows us to integrate the other evidence in the case.
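
The paper embeds the prior in a full Bayesian network; the hand-rolled two-hypothesis calculation below (with invented likelihoods) is only meant to show where the opportunity prior enters a Bayesian update:

```python
def posterior_guilt(prior: float, p_e_given_g: float, p_e_given_not_g: float) -> float:
    """P(G | E) by Bayes' theorem for a single piece of evidence E."""
    joint_g = p_e_given_g * prior
    joint_not_g = p_e_given_not_g * (1 - prior)
    return joint_g / (joint_g + joint_not_g)

prior = 1 / 50   # the opportunity prior from the sketch above
# Invented likelihoods for one strongly incriminating piece of evidence:
print(posterior_guilt(prior, p_e_given_g=0.99, p_e_given_not_g=0.01))   # ~0.669
```

Note that even quite strong evidence does not, on its own, push a realistic opportunity prior anywhere near certainty - which is exactly why the prior matters.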

Full details:
Fenton, N. E., Lagnado, D. A., Dahlman, C., & Neil, M. (2019). "The Opportunity Prior: A proof-based prior for criminal cases", Law, Probability and Risk. https://doi.org/10.1093/lpr/mgz007

Monday 13 May 2019

When 'absence of forensic evidence' is not 'neutral'


It is widely accepted that ‘evidence of absence’ (such as an alibi confirming that the defendant was not at the crime scene) is not the same as ‘absence of evidence’ (such as where there is no evidence about whether or not the defendant was at the crime scene).

However, for forensic evidence, there is often confusion between these concepts. If DNA found at the crime scene does not match the defendant, is that ‘evidence of absence’ or ‘absence of evidence’? It depends, of course, on the circumstances. If there is a high probability that the DNA found must have come from the person who committed the crime, then this is clearly ‘evidence of absence’ - the fact that it does not match the defendant is highly probative in favour of the defence. On the other hand, if the only DNA found at the crime scene is actually unrelated to the person who committed the crime, then this is clearly ‘absence of evidence’ - the fact that it does not match the defendant is no more probative for the defence than for the prosecution (so the evidence is ‘neutral’). The problem is that lawyers and forensic scientists often wrongly assume that ‘evidence of absence’ is also ‘neutral’.
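
A rough numerical sketch of this point (not the Bayesian network 'proof' in the report - the parameters and numbers here are invented for illustration): let s be the probability that the crime-scene DNA came from the perpetrator, and m the random-match probability for an unrelated person.

```python
# How probative is 'crime-scene DNA does not match the defendant'?
# s: probability the crime-scene DNA came from the perpetrator (invented)
# m: random-match probability for an unrelated person (invented)

def lr_defence(s: float, m: float = 1e-9) -> float:
    """Likelihood ratio P(no match | innocent) / P(no match | guilty)."""
    p_no_match_if_guilty = (1 - s) * (1 - m)   # the DNA must be someone else's
    p_no_match_if_innocent = 1 - m             # an innocent defendant almost never matches
    return p_no_match_if_innocent / p_no_match_if_guilty

print(lr_defence(s=0.0))    # 1.0   -> 'absence of evidence': neutral
print(lr_defence(s=0.99))   # ~100  -> 'evidence of absence': strongly favours the defence
```

As s approaches 1 (the DNA almost certainly came from the perpetrator), the non-match becomes overwhelmingly probative for the defence; treating it as neutral is then a serious error.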

The full report (5 pages) includes a 'proof' (using a simple Bayesian network model) of how the experts get it wrong in a real example.
Fenton, N. E. (2019). "When 'absence of forensic evidence' is not 'neutral'". https://doi.org/10.13140/RG.2.2.14517.73440

The Bayesian network model is available here. It can be run in the trial version of AgenaRisk.

Wednesday 1 May 2019

House of Lords Report on Forensic Science and the Criminal Justice System


The House of Lords Report published today contains the following quote from me that was part of The Alan Turing Institute submission:



I said a lot more in the Turing submission about the use of probability and statistics in evidence, including concerns about low-template DNA evidence and the possibility of using Bayesian networks to properly assess the overall impact of multiple pieces of related evidence.

Two other Queen Mary colleagues (Amber Marks and Ian Walden) also contributed to the Turing submission. 

For full details see:

  • House of Lords, The Science and Technology Select Committee "Forensic science and the criminal justice system: a blueprint for change" HL Paper 333, 1 May 2019, https://t.co/M6utVY8Z0b 
  • The Alan Turing Institute "Response to the House of Lords inquiry: Forensic Science in Criminal Justice", 13 September 2018, https://t.co/OBNeceVqhu
  • Fenton, N. E., Neil, M., & Berger, D. (2016). "Bayes and the Law", Annual Review of Statistics and Its Application, 3, 51-77. http://dx.doi.org/10.1146/annurev-statistics-041715-033428 (This is cited in both of the above reports; see also the blog posting about this article.)