Wednesday, 11 November 2015

BBC Documentary co-presented by Norman Fenton wins AAAS Science Journalism Gold Award for "best in-depth TV reporting"

In March I reported on my experience of presenting the BBC documentary "Climate Change by Numbers". The programme has won the American Association for the Advancement of Science (AAAS) Science Journalism Gold Award for "best in-depth TV reporting". The summary citation says:
The Gold Award for in-depth television reporting went to a BBC team for a documentary that used clever analogies and appealing graphics to discuss three key numbers that help clarify important questions about the scale and pace of human influence on climate. The program featured a trio of mathematicians who use numbers to reveal patterns in data, assess risk, and help predict the future.
Jonathan Renouf, Executive Producer at BBC Science, said (to those involved in the making of the programme):
It’s a huge honour to win this award; it’s a global competition, open to programmes in every area of science, and it’s judged by science journalists. I can’t think of a finer and more prestigious endorsement of the research and journalistic rigour that you brought to bear in the film. We all know how difficult it is to make programmes about climate change that tread the line between entertainment, saying something new, and keeping the story journalistically watertight. I’m really thrilled to see your efforts recognised in top scientific circles.
Full details of the awards can be found on the AAAS website.

Friday, 6 November 2015

Update on the use of Bayes in the Netherlands Appeal Court

In July I reported on the so-called Breda 6 case in the Netherlands and how a Bayesian argument was presented in the review of the case. My own view was that the Bayesian argument was crying out for a Bayesian network representation (my article provided a model to do exactly that).

Now Richard Gill has told me the following:
Finally there has been a verdict in the 'Breda 6' case. The suspects were (again) found guilty. The court is somewhat mixed with respect to the Bayesian analysis: on the one hand they ruled that Frans Alkemade had the required expertise and that he was rightly appointed as a 'Bayesian expert'. On the other hand they ruled that a Bayesian analysis is still too controversial to be used in court. They therefore disregarded 'the conclusion' of Frans's report. This is a remarkable and unusual formulation in verdicts; the normal wording is that the report has been disregarded.
This unusual wording is no accident: if the court said that it had disregarded the report, that would be untrue, since quite a lot of the Bayesian reasoning is in fact included in the judgment. A number of considerations from Frans's report are fully paraphrased, and some are quoted almost verbatim.
Also I noticed that the assessment of certain findings is expressed in a nicely Bayesian manner.
However, contrary to Frans's assessment, the court still thinks that the original confessions of three of the suspects constitute strong evidence. Unfortunately, the case is not yet closed: it has been taken to the high court.
Frans Alkemade has also been appointed as a Bayesian expert in yet another criminal case.

The ruling that the Bayesian analysis is too controversial is especially disappointing, since we have recently run workshops with Dutch judges who are very keen to use Bayesian reasoning - and even Bayesian networks (there are no juries in the Netherlands, so the judges really do have to make the decisions themselves). These judges - along with Frans Alkemade - will be among the many top lawyers, legal scholars, forensic scientists, and mathematicians participating in the Isaac Newton Institute Cambridge Programme on Probability and Statistics in Forensic Science, which takes place July-December 2016. I organised this programme along with David Lagnado, David Balding, Richard Gill and Leila Schneps. It derives from our Bayes and the Law consortium, which states that, despite the obvious benefits of using Bayes:

The use of Bayesian reasoning in investigative and evaluative forensic science and the law is, however, the subject of much confusion. It is deployed in the adduction of DNA evidence, but expert witnesses and lawyers struggle to articulate the underlying assumptions and results of Bayesian reasoning in a way that is understandable to lay people. The extent to which Bayesian reasoning could benefit the justice system by being deployed more widely, and how it is best presented, is unclear and requires clarification.
One of the core objectives of the 6-month programme is to address this issue thoroughly. Within the programme there are three scheduled workshops:
  1. "The nature of questions arising in court that can be addressed via probability and statistical methods", Tue 30th Aug 2016
  2. "Bayesian networks in evidence analysis", Mon 26th Sep - Thu 29th Sep 2016
  3. "Statistical methods in DNA analysis and analysis of trace evidence", Mon 7th Nov 2016

Monday, 26 October 2015

Cyber security risk of nuclear facilities using Bayesian networks

Scientists from Korea (Jinsoo Shin, Hanseong Son, Khalil ur Rahman, and Gyunyoung Heo) have published an article describing their Bayesian network model for assessing the cyber security risk of nuclear facilities (using the AgenaRisk tool). It combines two models: one is process based (considering how well security procedures were followed) and the other is architecture based (considering system vulnerabilities and controls). The full paper is here:

Shin, J., Son, H., Khalil ur, R., & Heo, G. (2015). Development of a cyber security risk model using Bayesian networks. Reliability Engineering & System Safety, 134, 208–217. doi:10.1016/j.ress.2014.10.006

Bayesian Networks for Risk Assessment of Public Safety and Security Mobile Service

A new paper by Matti Peltola and Pekka Kekolahti of Aalto University (School of Electrical Engineering) in Finland uses Bayesian networks and the AgenaRisk tool to gain a deeper understanding of the availability of Public Safety and Security (PSS) mobile networks and their service under different conditions. The paper abstract states:
A deeper understanding of the availability of Public Safety and Security (PSS) mobile networks and their service under different conditions offers decision makers guidelines on the level of investments required and the directions to take in order to decrease the risks identified. In the study, a risk assessment model for the existing PSS mobile service is implemented for both a dedicated TETRA PSS mobile network as well as for a commercial 2G/3G mobile network operating under the current risk conditions. The probabilistic risk assessment is carried out by constructing a Bayesian Network. According to the analysis, the availability of the dedicated Finnish PSS mobile service is 99.1%. Based on the risk assessment and sensitivity analysis conducted, the most effective elements for decreasing availability risks would be duplication of the transmission links, backup of the power supply and real-time mobile traffic monitoring. With the adjustment of these key control variables, the service availability can be improved up to the level of 99.9%. The investments needed to improve the availability of the PSS mobile service from 99.1% to 99.9% are profitable only in highly populated areas. The calculated availability of the PSS mobile service based on a purely commercial network is 98.8%. The adoption of a Bayesian Network as a risk assessment method is demonstrated to be a useful way of documenting different expert knowledge as a common belief about the risks, their magnitudes and their effects upon a Finnish PSS mobile service.
Full reference details:
Peltola, M. J., & Kekolahti, P. (2015). Risk Assessment of Public Safety and Security Mobile Service. In 2015 10th International Conference on Availability, Reliability and Security (pp. 351–359). IEEE. doi:10.1109/ARES.2015.65

Sunday, 18 October 2015

What is the value of missing information when assessing decisions that involve actions for intervention?

This is a summary of the following new paper:

Constantinou, A. C., Yet, B., Fenton, N., Neil, M., & Marsh, W. (2015). "Value of Information analysis for interventional and counterfactual Bayesian networks in forensic medical sciences". Artificial Intelligence in Medicine. doi:10.1016/j.artmed.2015.09.002. The full pre-publication version can be found here.

Most decision support models in the medical domain provide a prediction about a single key unknown variable, such as whether a patient exhibiting certain symptoms is likely to have (or develop) a particular disease.

However, we seek to enhance decision analysis by determining whether a decision based on such a prediction could be altered on the basis of some incomplete information within the model, and whether it would be worthwhile for the decision maker to seek further information prior to deciding. In particular we wish to incorporate interventional actions and counterfactual analysis, where:
  • An interventional action is one that can be performed to influence some desirable future outcome. In medical decision analysis, an intervention is typically some treatment, which can affect a patient’s health outcome.
  • Counterfactual analysis enables decision makers to compare the observed results in the real world with those of a hypothetical world: what actually happened versus what would have happened under some different scenario.
The method we use is based on the underlying principle of Value of Information. This is a technique initially proposed in economics for the purposes of determining the amount a decision maker would be willing to pay for further information that is currently unknown within the model.
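As a rough illustration of this principle, the sketch below computes the Expected Value of Perfect Information (EVPI) for a simple two-act, two-state decision. All numbers and names are entirely hypothetical; they are not taken from the paper:

```python
# A minimal Value of Information sketch with hypothetical numbers:
# a decision maker chooses between two acts whose payoff depends on
# an unknown binary state. EVPI is the most the decision maker
# should be willing to pay to learn the state before choosing.

p_bad = 0.3                          # P(state = "bad")
payoff = {                           # payoff[act][state]
    "treat":    {"bad": 80, "ok": 60},
    "no_treat": {"bad": 10, "ok": 100},
}

def expected_payoff(act):
    return p_bad * payoff[act]["bad"] + (1 - p_bad) * payoff[act]["ok"]

# Best expected payoff acting now, without further information.
best_without_info = max(expected_payoff(a) for a in payoff)

# Expected payoff if the state were revealed before choosing:
# pick the best act separately for each possible state.
best_with_info = (p_bad * max(payoff[a]["bad"] for a in payoff)
                  + (1 - p_bad) * max(payoff[a]["ok"] for a in payoff))

evpi = best_with_info - best_without_info
```

With these made-up numbers, acting blind the best act is "no_treat" (expected payoff 73), whereas perfect information raises the expected payoff to 94, so the information is worth up to 21 payoff units.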

The type of predictive decision support models to which our work applies are Bayesian networks. These are graphical models which represent the causal or influential relationships between a set of variables and which provide probabilities for each unknown variable.
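As a minimal illustration (with made-up probabilities), here is a two-node network of the kind described, where inference reduces to a direct application of Bayes' theorem:

```python
# A minimal two-node Bayesian network (Disease -> Symptom) with
# hypothetical probabilities. For a network this small, inference
# is a direct application of Bayes' theorem rather than a
# general-purpose propagation algorithm.

p_disease = 0.01                      # prior P(Disease = true)
p_symptom = {True: 0.9, False: 0.1}   # P(Symptom = true | Disease)

def posterior_disease(symptom_present):
    """Return P(Disease = true | observed Symptom state)."""
    def likelihood(disease):
        p = p_symptom[disease]
        return p if symptom_present else 1 - p
    numerator = p_disease * likelihood(True)
    denominator = numerator + (1 - p_disease) * likelihood(False)
    return numerator / denominator
```

Observing the symptom raises the probability of disease from the 1% prior to roughly 8.3%, a standard example of why such posteriors are often much lower than intuition suggests.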

The method is applied to two real-world Bayesian network models that were previously developed for decision support in forensic medical sciences. In these models a decision maker (such as a probation officer or a clinician) has to determine whether to release a prisoner/patient based on the probability of the (unknown) hypothesis variable: “individual violently reoffends after release”. Prior to deciding on release, the decision maker has the option to simulate various interventions to determine whether an individual’s risk of violence can be managed to acceptable levels. Additionally, the decision maker may have the option to gather further information about the individual. It is possible that knowing one or more of these unobserved factors may lead to a different decision about release.

We used the method to examine the average information gain; that is, what we learn about the importance of the factors that remain unknown within the model. Based on six different sets of experiments with various assumptions we show that:
  1. the average relative percentage gain in terms of Value of Information ranged between 11.45% and 59.91% (where a gain of X% indicates an expected X% relative reduction of the risk of violent reoffence);
  2. the potential amendments in decision making, as a result of the expected information gain, ranged from 0% to 86.8% (where an amendment of X% indicates that X% of the initial decisions are expected to have been altered).
The key concept of the method is that if we had known that the individual was, for example, a substance misuser, we would have arranged for a suitable treatment; whereas without having information about substance misuse it is impossible to arrange such a treatment and, thus, we risk not treating the individual in the case where he or she is a substance misuser.
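That reasoning can be sketched numerically. The probabilities below are purely hypothetical and are not taken from the paper; they simply show how observing the risk factor enables a targeted intervention that is impossible without it:

```python
# Hypothetical illustration: a targeted intervention (treatment) can
# only be arranged when the relevant risk factor (substance misuse)
# is actually observed. Keys of `risk` are (misuser, treated).

p_misuse = 0.4                 # P(individual is a substance misuser)
risk = {                       # P(violent reoffence | misuse, treated)
    (True, True): 0.15,        # misuser, treated
    (True, False): 0.50,       # misuser, untreated
    (False, True): 0.10,       # treatment assumed irrelevant
    (False, False): 0.10,      # for non-misusers
}

# Without the information, no targeted treatment can be arranged.
risk_unknown = (p_misuse * risk[(True, False)]
                + (1 - p_misuse) * risk[(False, False)])

# With the information, treat exactly when misuse is observed.
risk_known = (p_misuse * risk[(True, True)]
              + (1 - p_misuse) * risk[(False, False)])
```

Under these invented numbers the expected risk of violent reoffence falls from 0.26 to 0.12 once the factor can be observed and acted upon, a relative reduction of the same order as the gains reported in the paper's experiments.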

The method becomes useful for decision makers, not only when decision making is subject to amendments on the basis of some unknown risk factors, but also when it is not. Knowing that a decision outcome is independent of one or more unknown risk factors saves us from seeking information about that particular set of risk factors.

This summary can also be found on the Atlas of Science.

Thursday, 15 October 2015

Talk: Bayesian networks: why smart data is better than big data

Bayesian networks: why smart data is better than big data
by Prof. Norman Fenton from the School of Electronic Engineering and Computer Science (QMUL)
WHEN: Fri 16th October, 2-3 pm
WHERE: People's Palace PP2 (Mile End Campus)

"This talk will provide an introduction to Bayesian networks which, due to relatively recent algorithmic breakthroughs, has become an increasingly popular technique for risk assessment and decision analysis. I will provide an overview of successful applications (including transport safety, medical, law/forensics, operational risk, and football prediction). What is common to all of these applications is that the Bayesian network models are built using a combination of expert judgment and (often very limited) data. I will explain why Bayesian networks ‘learnt’ purely from data – even when ‘big data’ is available - generally do not work well."

All are welcome. The seminar consists of an approximately 45-minute lecture followed by discussion.
In case of any questions, feel free to contact me.
Hope to see you tomorrow,

Judit Petervari
PhD Student

Biological and Experimental Psychology Group
School of Biological and Chemical Sciences
Queen Mary University of London
Mile End Road
E1 4NS London
United Kingdom

Office: G.E. Fogg Building, Room 2.16

Friday, 2 October 2015

Beware a 'journal' called FSS (Forensic Science Seminars): publishing papers without authors' permission

Screenshot of our unpublished draft paper that somehow got published in the 'journal' FSS
I have previously reported here and here on cases of our work being plagiarised in the most brazen way. Now comes a more unusual case - our unpublished work has been published in a 'journal' without our permission. And it seems the journal's articles may all be obtained in this way.

A forensic scientist in the Netherlands contacted me this week to say that he had found a copy of one of his papers in a journal he had never heard of, namely ‘Forensic Science Seminar’. In the same journal, he found this article by us. In fact, that article is an exact copy of this unpublished draft article that appears on my website. The only difference is that the article on my website (which is dated Jan 2012) says the following very clearly on the front:
Much of the work in this unpublished draft paper has subsequently been published in the following (which should be cited):
  • Fenton, N. E., D. Berger, D. Lagnado, M. Neil and A. Hsu, (2014). "When ‘neutral’ evidence still has probative value (with implications from the Barry George Case)", Science and Justice, 54(4), 274-287 
  • Fenton, N. E., Neil, M., & Hsu, A. (2014). "Calculating and understanding the value of any type of match evidence when there are potential testing errors". Artificial Intelligence and Law, 22. 1-28 . 
The 'journal' is printed and published by "ZolCat® Academic House". Their other titles include "Science and Nature", "Frontiers of Engineering Journal", and "Journal of Computer Sciences".