Probabilistic numerical methods (PNM) explicitly model the uncertainty arising in numerical computations. Early foundations of the field reach back more than a century. However, only in the past few decades, with ever-increasing computing resources and data volumes, have the needs and tools for uncertainty-aware computation arisen. Uncertainty quantification for numerical methods is promising, if not essential, in various modern scientific computing tasks. For example, large physically relevant PDE systems cannot be affordably solved to arbitrary accuracy, and contemporary Big Data applications give rise to stochastic problems that classical methods fail to solve, calling for new algorithms. A probabilistic treatment of numerical methods can further allow for adaptive methods and problem-specific decision-making in expensive models that do not allow for many realizations. An as-yet understudied challenge to be addressed in future PNM is the treatment of numerical uncertainty in pipelines of computations. These directions require a thorough theoretical understanding as well as easy-to-apply black-box algorithms for the practitioner. This minisymposium takes a holistic approach to PNM and serves to survey recent advances and establish future research directions.
Probabilistic numerics: History and recent trends
Tim Sullivan | Free University of Berlin | Germany
The interplay of probability and numerical analysis is a topic with a long and interesting history, reaching back to Poincaré. In the computer age, probabilistic perspectives on numerical tasks such as linear algebra, quadrature, optimisation, and the solution of differential equations have enjoyed recurrent waves of attention and are now experiencing renewed interest under the name "probabilistic numerics". This talk will give a general overview of the history of the field and its relationship to other paradigms such as average-case analysis and information-based complexity, thereby placing it in a rounded but modern context. There will also be a survey of recent research results and future directions for the field, including applications to statistical inverse problems and scientific computing more generally.
It is time to take Uncertainty Seriously
Philipp Hennig | University of Tübingen & Max Planck Institute for Intelligent Systems | Germany
In contemporary numerical tasks, the presence of Big Data can make classic numerical methods shockingly ineffective. A particularly drastic example is deep learning, a non-convex and highly stochastic problem class in which all the good old optimizers simply do not work, and practitioners are forced to resort to fiddling around with the parameters of (variants of) gradient descent. Several paradigms of classic numerics are called into question by this development, but it also offers exciting opportunities for big impact with new algorithms. I will argue that probabilistic numerics offers concepts and functionality that make it the most promising formalism to address this challenge, and illustrate this with a number of recent results. I will also discuss an urgent need for theorists and applied mathematicians to stay on top of rapidly developing fields, and present software packages that can help achieve this.
Fast Bayesian Inference for Differential Equations Using Probabilistic Numerical Methods
Chris Oates | Newcastle University | United Kingdom
This talk considers time-evolving inverse problems that are solved with Bayesian statistical methods. In particular, we extend Bayesian methods to incorporate statistical models for the error that is incurred in the numerical solution of the physical governing equations. This enables full uncertainty quantification within a principled computation-precision trade-off, in contrast to the over-confident inferences that are obtained when all sources of numerical error are ignored. An application to industrial hydrocyclone equipment serves to illustrate the approach.
Active Multi-Source Bayesian Quadrature for Expensive Simulations
Maren Mahsereci | Amazon | United Kingdom
Bayesian quadrature (BQ) is a sample-efficient probabilistic numerical method for solving integrals of expensive-to-evaluate black-box functions. So far, however, active BQ focuses merely on the integrand itself as the information source and does not allow for information transfer from cheaper, related functions. We explore active learning in BQ when multiple related information sources of variable cost (in input and source) are accessible. This setting arises, for example, when evaluating the integrand requires running a complex simulation that can be approximated by simulating at lower levels of sophistication and at lesser expense. We construct meaningful cost-sensitive multi-source acquisition functions as an extension of common utility functions from vanilla BQ, and discuss pitfalls that arise from blindly generalizing.
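For readers unfamiliar with the vanilla setting that the talk extends, the sketch below illustrates single-source active Bayesian quadrature with a simple uncertainty-sampling acquisition rule. The RBF kernel, uniform integration measure, toy integrand, and acquisition choice are illustrative assumptions, not the cost-sensitive multi-source method presented in the talk.

```python
# Minimal sketch of active Bayesian quadrature (single source, Uniform[0, 1] measure).
# Assumptions: RBF kernel, noiseless evaluations, uncertainty-sampling acquisition.
import numpy as np
from scipy.special import erf

ell = 0.2                      # kernel length scale (assumed)
f = lambda x: np.sin(3 * x)    # placeholder for the expensive black-box integrand

def k(a, b):
    """RBF kernel matrix between point sets a and b."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

def kernel_mean(x):
    """Closed-form integral of the RBF kernel against Uniform[0, 1]."""
    c = ell * np.sqrt(np.pi / 2)
    return c * (erf((1 - x) / (np.sqrt(2) * ell)) - erf((0 - x) / (np.sqrt(2) * ell)))

X = np.array([0.5])            # initial design
grid = np.linspace(0, 1, 200)  # candidate points for the acquisition rule

for _ in range(5):
    y = f(X)
    K = k(X, X) + 1e-10 * np.eye(len(X))   # jitter for numerical stability
    Kinv = np.linalg.inv(K)
    z = kernel_mean(X)
    integral_mean = z @ Kinv @ y           # posterior mean of the integral
    # Uncertainty sampling: query where the posterior variance of the integrand is largest.
    kg = k(grid, X)
    var = 1.0 - np.einsum("ij,jk,ik->i", kg, Kinv, kg)
    X = np.append(X, grid[np.argmax(var)])

print("BQ estimate:", integral_mean, " true value:", (1 - np.cos(3)) / 3)
```

The multi-source, cost-sensitive acquisitions discussed in the talk would replace the simple variance criterion above with utilities that weigh the expected information about the integral against the cost of querying each source.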