Mark Girolami is an EPSRC Established Career Research Fellow (2012–2018) and was previously an EPSRC Advanced Research Fellow (2007–2012). He is the Director of the Lloyd’s Register Foundation Turing Programme on Data Centric Engineering, and previously led the EPSRC-funded Research Network on Computational Statistics and Machine Learning, which is now a Section of the UK’s Royal Statistical Society. In 2011 he was elected Fellow of the Royal Society of Edinburgh and also awarded a Royal Society Wolfson Research Merit Award. He was one of the founding Executive Directors of the Alan Turing Institute for Data Science from 2015 to 2016, before taking leadership of the Data Centric Engineering Programme at The Alan Turing Institute. His paper on Riemann Manifold Hamiltonian Monte Carlo methods was read before the Royal Statistical Society, receiving the largest number of contributed discussions of any paper in the Society’s 183-year history.
Mark Girolami’s Medallion lecture will be given at the 2017 Joint Statistical Meetings in Baltimore (July 29–August 4, 2017). See the online program at http://ww2.amstat.org/meetings/jsm/2017/onlineprogram/index.cfm
Probabilistic Numerical Computation: a Role for Statisticians in Numerical Analysis?
Consider the consequences of an alternative history. What if Leonhard Euler had happened to read the posthumously published paper by Thomas Bayes, “An Essay towards solving a Problem in the Doctrine of Chances”? That paper appeared in 1763 in the Philosophical Transactions of the Royal Society, so had Euler read it, we might wonder whether the section on the numerical solution of differential equations in his three-volume Institutionum calculi integralis, published in 1768, would have been quite different.
Would Euler’s awareness of the “Bayesian” proposition, that uncertainty about unknown quantities can be characterised using the probability calculus, have changed the development of numerical methods and their analysis into something more inherently statistical?
Fast forward the clock two centuries to the late 1960s in America, when the mathematician F.M. Larkin published a series of papers on the definition of Gaussian measures in infinite-dimensional Hilbert spaces, culminating in the 1972 work “Gaussian measure in Hilbert space and applications in numerical analysis”. In that work the mathematical tools required to study average-case errors of numerical methods in Hilbert spaces were formally laid down, and methods such as Bayesian Quadrature or Bayesian Monte Carlo were developed in full, long before their independent reinvention in the 1990s and 2000s brought them to a wider audience.
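To make the flavour of those ideas concrete, the following is a minimal sketch of Bayesian Quadrature in Python. It assumes a zero-mean Gaussian process prior with a squared-exponential kernel, a standard normal integration measure, and an illustrative integrand; none of these choices are taken from Larkin’s paper or the lecture.

```python
# Minimal Bayesian Quadrature sketch: the kernel, length-scale and test
# integrand below are illustrative assumptions, not a prescribed method.
import numpy as np

def bayesian_quadrature(f, nodes, lengthscale=0.5, jitter=1e-10):
    """Posterior mean and variance of I = E_{x~N(0,1)}[f(x)] under a GP prior.

    The integrand f gets a zero-mean GP prior with squared-exponential kernel
    k(x, x') = exp(-(x - x')^2 / (2 * lengthscale^2)); conditioning on the
    evaluations f(nodes) yields a Gaussian posterior for the integral.
    """
    x = np.asarray(nodes, dtype=float)
    y = f(x)
    l2 = lengthscale ** 2

    # Gram matrix of the kernel at the nodes (jitter for numerical stability).
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / l2)
    K += jitter * np.eye(len(x))

    # Kernel mean embedding z_i = integral of k(x, x_i) N(x; 0, 1) dx,
    # available in closed form for this kernel and integration measure.
    z = np.sqrt(l2 / (l2 + 1.0)) * np.exp(-0.5 * x ** 2 / (l2 + 1.0))
    # Prior variance of the integral: double integral of the kernel
    # against N(0, 1) in both arguments.
    prior_var = np.sqrt(l2 / (l2 + 2.0))

    w = np.linalg.solve(K, z)   # quadrature weights
    mean = w @ y                # posterior mean of the integral
    var = prior_var - w @ z     # posterior variance (remaining uncertainty)
    return mean, max(var, 0.0)

if __name__ == "__main__":
    f = lambda x: np.sin(x) + x ** 2          # E[f] = 1 under N(0, 1)
    mean, var = bayesian_quadrature(f, np.linspace(-3, 3, 15))
    print(f"integral estimate: {mean:.4f} +/- {np.sqrt(var):.2e}")
```

The point of the construction is that the posterior variance quantifies how much uncertainty about the integral remains after only a finite number of function evaluations.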
Now, in 2017, viewing numerical analysis as a problem of Statistical Inference seems in many ways natural, and such a treatment is being demanded by applied mathematicians, engineers and physicists who need to account carefully and fully for all sources of uncertainty in mathematical modelling and numerical simulation.
A research frontier has now emerged in scientific computation, founded on the principle that the error in numerical methods, for example those that solve differential equations, entails uncertainty that ought to be subjected to statistical analysis. This viewpoint raises exciting challenges for contemporary statistical and numerical analysis, including the design of statistical methods that enable the coherent propagation of probability measures through a computational and inferential pipeline.
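As one illustration of how discretisation error in a differential equation solver can be treated statistically, the sketch below perturbs each step of a classical Euler solve with small Gaussian noise and runs an ensemble of trajectories, so that the spread of the ensemble expresses uncertainty about the true solution. The test equation, step size and noise scaling are assumptions made purely for this example, not a method prescribed by the lecture.

```python
# Randomly perturbed Euler method, run as an ensemble, as a simple sketch of
# treating discretisation error statistically. All numerical choices here
# (ODE, step size, noise scale) are illustrative assumptions.
import numpy as np

def perturbed_euler_ensemble(f, y0, t0, t1, h, noise_scale=0.1,
                             n_samples=200, seed=0):
    """Ensemble of randomly perturbed Euler solves of dy/dt = f(t, y)."""
    rng = np.random.default_rng(seed)
    n_steps = int(round((t1 - t0) / h))
    ys = np.full(n_samples, float(y0))
    t = t0
    for _ in range(n_steps):
        # Deterministic Euler increment plus a Gaussian perturbation whose
        # standard deviation shrinks with the step size, standing in for the
        # unknown local truncation error (noise_scale is an assumed constant).
        ys = ys + h * f(t, ys) + rng.normal(0.0, noise_scale * h ** 1.5,
                                            size=n_samples)
        t += h
    return ys

if __name__ == "__main__":
    f = lambda t, y: -y                       # dy/dt = -y, exact solution exp(-t)
    samples = perturbed_euler_ensemble(f, y0=1.0, t0=0.0, t1=2.0, h=0.1)
    print(f"exact y(2)      = {np.exp(-2.0):.4f}")
    print(f"ensemble mean   = {samples.mean():.4f}")
    print(f"ensemble spread = {samples.std():.4f}")
```

In a larger computational pipeline, the whole ensemble, rather than a single trajectory, would be passed downstream, which is one simple way a probability measure over numerical solutions can be propagated through subsequent computation and inference.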