The synthesis of various information sources, including a priori domain knowledge, statistical assumptions, field data, and large-scale numerical models, is one of the key steps in building interpretable and predictive models that support critical decisions in science, engineering, medicine, and beyond. Typical examples can be found in oil/gas reservoir modeling, treatment of saltwater intrusion, medical imaging, tumor treatment, and aircraft design. Because the numerical models are computationally costly and the accuracy requirements on the statistical learning outcomes are stringent, multilevel and multi-fidelity methods provide a viable route for solving these model-based statistical learning tasks. This mini-symposium will bring together researchers working at the forefront of multilevel and multi-fidelity methods (and other relevant methods) intended to accelerate model-based statistical learning tasks.
14:00
- NEW - Data Driven Multiscale Methods for Bayesian Inverse Problems based on Large Scale Partial Differential Equations
Tim Dodwell | University of Exeter | United Kingdom
Authors:
Tim Dodwell | University of Exeter | United Kingdom
Robert Scheichl | Ruprecht-Karls University Heidelberg | Germany
For Markov chain Monte Carlo (MCMC) methods in Bayesian inverse problems governed by complex partial differential equations (PDEs), a significant bottleneck is the evaluation of the likelihood, which requires repeated solves of often large systems of equations. Multilevel strategies have been developed to accelerate inversion by combining high-fidelity solves with coarser discretisations of the PDE. This has been shown to significantly accelerate inference. However, for some interesting problems, particularly those in which the PDE parameters are both high contrast and vary on an intermediate length scale, constructing sufficiently accurate coarse models to achieve practical gains can be challenging.
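As an illustration of the two-level idea behind such strategies (not the authors' specific method), the following sketch uses a cheap coarse-grid likelihood to screen proposals before the expensive fine-grid likelihood is evaluated, in the style of delayed-acceptance MCMC. The functions log_prior, log_like_coarse, and log_like_fine, as well as the proposal scale, are hypothetical placeholders.

```python
import numpy as np

def delayed_acceptance_mh(theta0, log_prior, log_like_coarse, log_like_fine,
                          n_steps, prop_std=0.1, rng=None):
    """Delayed-acceptance MH: screen proposals with a cheap coarse-level
    likelihood; only survivors pay for the expensive fine-level solve."""
    rng = np.random.default_rng(rng)
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    lp_c = log_prior(theta) + log_like_coarse(theta)   # coarse posterior (log)
    lp_f = log_prior(theta) + log_like_fine(theta)     # fine posterior (log)
    chain = [theta.copy()]
    for _ in range(n_steps):
        prop = theta + prop_std * rng.standard_normal(theta.shape)
        lp_c_prop = log_prior(prop) + log_like_coarse(prop)
        # Stage 1: accept/reject using the coarse model only.
        if np.log(rng.random()) < lp_c_prop - lp_c:
            lp_f_prop = log_prior(prop) + log_like_fine(prop)
            # Stage 2: correct with the fine model so that the fine posterior
            # remains the invariant distribution of the chain.
            if np.log(rng.random()) < (lp_f_prop - lp_f) - (lp_c_prop - lp_c):
                theta, lp_c, lp_f = prop, lp_c_prop, lp_f_prop
        chain.append(theta.copy())
    return np.array(chain)
```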
14:30
Multilevel MCMC as a Surrogate Transition Method
Colin Fox | University of Otago | New Zealand
Authors:
Colin Fox | University of Otago | New Zealand
Robert Scheichl | Heidelberg University | Germany
The Multilevel MCMC (MLMCMC) of Dodwell et al. (SIAM/ASA JUQ 2013, SIGEST 2019) relies on independence of samples at the coarse level to ensure detailed balance and ergodicity of the resulting algorithm with respect to the desired target distribution, and hence is correct only in the limit of infinitely long subsampling (cf. Alg. 2, Lemma 3.1, Dodwell et al., 2013). An apparently minor modification of the MLMCMC algorithm allows it to be written as a small extension of the surrogate transition method of Liu (Sec. 9.4.3, Monte Carlo Strategies in Scientific Computing, 2001), and hence ensures ergodicity with respect to the correct target for any finite subsampling rate. A bonus is that the rate of convergence of estimates is also improved. We present that modification and a proof of detailed balance, from which ergodicity follows by standard results.
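For reference, here is a minimal sketch of a generic surrogate-transition step in the spirit of Liu (2001), not the specific MLMCMC modification presented in this talk: a finite-length Metropolis subchain targeting the coarse posterior generates the proposal, and a single fine-level accept/reject step keeps the fine target invariant for any finite subchain length. log_post_coarse and log_post_fine are hypothetical stand-ins for coarse- and fine-level log posteriors.

```python
import numpy as np

def surrogate_transition_step(theta, lpf, lpc, log_post_fine, log_post_coarse,
                              n_sub=5, prop_std=0.1, rng=None):
    """One surrogate-transition move: a length-n_sub Metropolis subchain on the
    coarse target proposes a new state; one fine-level accept/reject step then
    preserves detailed balance with respect to the fine target."""
    rng = np.random.default_rng(rng)
    z, lpc_z = theta.copy(), lpc
    for _ in range(n_sub):                 # subchain targeting the coarse posterior
        prop = z + prop_std * rng.standard_normal(z.shape)
        lpc_prop = log_post_coarse(prop)
        if np.log(rng.random()) < lpc_prop - lpc_z:
            z, lpc_z = prop, lpc_prop
    lpf_z = log_post_fine(z)
    # The coarse subchain is reversible w.r.t. the coarse target, so the
    # fine-level acceptance ratio reduces to the fine/coarse log-ratio difference.
    if np.log(rng.random()) < (lpf_z - lpf) - (lpc_z - lpc):
        return z, lpf_z, lpc_z
    return theta, lpf, lpc
```

To run a chain, initialise lpf = log_post_fine(theta0) and lpc = log_post_coarse(theta0) once, then call the step repeatedly; only one fine-level evaluation is needed per outer step regardless of n_sub.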
15:00
Multilevel Best Linear Unbiased Estimators
Daniel Schaden | Technical University of Munich | Germany
Authors:
Daniel Schaden | Technical University of Munich | Germany
Elisabeth Ullmann | Technical University of Munich | Germany
We present a general variance reduction technique to accelerate the estimation of the expectation of a scalar-valued quantity of interest. We reformulate the estimation as a linear regression problem and show that the derived estimator, which we call the Sample Allocation Optimal Best Linear Unbiased Estimator, is asymptotically optimal within a certain class of linear estimators. We further derive an upper bound on the asymptotic complexity of this estimator for parametric models by the use of Richardson extrapolation, showing improvements upon other sampling-based estimation techniques such as Monte Carlo, multilevel Monte Carlo, and multifidelity Monte Carlo. We illustrate the results using a numerical example where the underlying model of the quantity of interest is a partial differential equation.
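The linear-regression reformulation can be illustrated by the following simplified sketch: independent sample means over groups of models are combined by generalised least squares to give a best linear unbiased estimate of the model means. It assumes the group covariances are known and omits the sample allocation optimisation that gives the estimator in the talk its name; all argument names are illustrative.

```python
import numpy as np

def multilevel_blue(group_means, group_covs, group_sizes, group_index_sets, n_models):
    """GLS-based BLUE of the model-mean vector from independent sample means of
    model groups (simplified sketch; covariances assumed known).

    group_means[k]      : sample mean vector over the models in group k
    group_covs[k]       : covariance matrix of the models in group k
    group_sizes[k]      : number of i.i.d. samples drawn for group k
    group_index_sets[k] : indices of the models evaluated in group k
    """
    Psi = np.zeros((n_models, n_models))   # GLS normal-equation matrix
    rhs = np.zeros(n_models)
    for y, C, m, idx in zip(group_means, group_covs, group_sizes, group_index_sets):
        R = np.zeros((len(idx), n_models))
        R[np.arange(len(idx)), idx] = 1.0  # restriction to the group's models
        W = m * np.linalg.inv(C)           # precision of the group's sample mean
        Psi += R.T @ W @ R
        rhs += R.T @ W @ y
    mu_hat = np.linalg.solve(Psi, rhs)     # BLUE of all model means
    return mu_hat[-1]                      # assumes the high-fidelity model is listed last
```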
15:30
- NEW - Unbiased inference for discretely observed hidden Markov model diffusions
Jordan Franks | Newcastle University | United Kingdom
Authors:
Jordan Franks | Newcastle University | United Kingdom
Ajay Jasra | King Abdullah University of Science and Technology (KAUST) | Saudi Arabia
Kody Law | University of Manchester | United Kingdom
Matti Vihola | University of Jyvaskyla | Finland
Diffusion processes observed with noise in the real world represent challenging models for Bayesian parameter inference in the presence of non-linearity, non-Gaussianity, and multi-dimensionality. We propose a general inference approach for diffusion processes observed with noise, thus opening up the possibility of inference for such challenging models. The algorithm is based on a combination of popular techniques, such as pseudo-marginal and adaptive Markov chain Monte Carlo, Euler-Maruyama discretisations, particle filters and coupling, and multilevel Monte Carlo. The algorithm is therefore generally programmable, and we give a small set of conditions under which the approach is guaranteed to deliver unbiased model inference. We also consider the efficiency of the approach within the multilevel framework. We then apply our algorithm to some example models.
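One building block of such a multilevel treatment, sketched below under simplifying assumptions (scalar state, generic drift and diffusion functions), is a pair of Euler-Maruyama paths at consecutive discretisation levels driven by the same Brownian increments; this is only an illustration of the coupling idea, not the authors' full inference algorithm.

```python
import numpy as np

def coupled_euler_maruyama(x0, drift, diffusion, T, level, rng=None):
    """Simulate a coupled pair of Euler-Maruyama paths of
    dX = drift(X) dt + diffusion(X) dW at the fine level (step T / 2**level)
    and the coarse level (step T / 2**(level - 1)), sharing Brownian increments.
    Coupled pairs like this feed the multilevel Monte Carlo corrections."""
    rng = np.random.default_rng(rng)
    n_fine = 2 ** level
    dt = T / n_fine
    xf = float(x0)           # fine path
    xc = float(x0)           # coarse path (step 2 * dt)
    dw_pair = 0.0
    for n in range(n_fine):
        dw = np.sqrt(dt) * rng.standard_normal()
        xf += drift(xf) * dt + diffusion(xf) * dw
        dw_pair += dw
        if n % 2 == 1:       # every second fine step, advance the coarse path
            xc += drift(xc) * 2 * dt + diffusion(xc) * dw_pair
            dw_pair = 0.0
    return xf, xc

# Example: Ornstein-Uhlenbeck-type drift with constant diffusion.
xf, xc = coupled_euler_maruyama(1.0, lambda x: -x, lambda x: 0.5, T=1.0, level=4)
```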