Markov chain Monte Carlo (MCMC) methods have become a well-established tool for approximate sampling in computational statistics and are indispensable for the quantification of uncertainty. One of the main advantages of MCMC approaches, demonstrated in numerical experiments and proven theoretically, is their robust behavior with respect to the dimension, so that in high-dimensional scenarios they are often the method of choice. In recent years there have been several new advances in algorithmic as well as theoretical aspects of these methods, for instance Wasserstein contraction arguments for proving ergodicity of Markov chains, MCMC methods for inference on non-Euclidean spaces such as Riemannian manifolds, and efficient Metropolis-Hastings algorithms for highly concentrated target measures.
The goal of this session is to discuss these recent developments and the dimension dependence of MCMC from a theoretical as well as a practical point of view.
16:30
Rapid mixing of geodesic walks on manifolds with positive curvature
Aaron Smith | University of Ottawa | Canada
We introduce a random walk for sampling from the uniform distribution on a Riemannian manifold. In the case that the manifold has positive curvature that is bounded away from 0 and infinity, we find tight bounds on the mixing time of this Markov chain in terms of the upper and lower bounds on the curvature of the manifold. Interestingly, these bounds are dimension-independent. After presenting the main results and a quick proof sketch, we will discuss practical implementation issues, including how to approximate the true geodesics that appear in the definition of the Markov chain and how to use our walk to sample from spaces (such as convex polytopes) that are not quite smooth manifolds. Time permitting, we will briefly discuss technical issues that appear when moving beyond the case of positive curvature.
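As a toy illustration of such a walk (our own sketch, not the construction from the talk), consider the unit sphere, a manifold of constant positive curvature on which geodesics are great circles and can be followed exactly:

```python
import numpy as np

def geodesic_walk_sphere(n_steps, dim, step_size, rng=None):
    """Geodesic random walk targeting the uniform distribution on the
    unit sphere S^(dim-1), a manifold of constant positive curvature.

    At each step: pick a uniform direction in the tangent space at the
    current point, then move along the corresponding great-circle
    geodesic for a fixed arc length (the step size)."""
    rng = np.random.default_rng(rng)
    x = np.zeros(dim)
    x[0] = 1.0  # start at a pole
    samples = []
    for _ in range(n_steps):
        v = rng.standard_normal(dim)
        v -= (v @ x) * x           # project onto the tangent space at x
        v /= np.linalg.norm(v)     # unit tangent direction
        # exact geodesic on the sphere: a great circle through x with velocity v
        x = np.cos(step_size) * x + np.sin(step_size) * v
        x /= np.linalg.norm(x)     # guard against round-off drift
        samples.append(x.copy())
    return np.array(samples)
```

On a general manifold the geodesic would have to be approximated numerically, which is exactly the kind of implementation issue the abstract alludes to.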
17:00
Quantitative convergence properties of slice sampling
Viacheslav Natarovskii | Georg-August University of Göttingen | Germany
We introduce the simple slice sampling algorithm for approximate sampling and show Wasserstein contraction for all distributions with log-concave and rotationally invariant densities. This yields, in particular, an explicit quantitative lower bound on the spectral gap. Moreover, this lower bound carries over to more general target distributions, depending only on the volume of the (super-)level sets of their unnormalized density. In addition, if time permits, we discuss several details of the elliptical slice sampler, in particular its reversibility and a way of proving Wasserstein contraction using the weak Harris theorem.
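For a rotationally invariant target the super-level sets are Euclidean balls, so simple slice sampling can be implemented exactly. A minimal sketch (our own, assuming the unnormalized density exp(-||x||), which is log-concave and rotationally invariant):

```python
import numpy as np

def slice_sampler_rot_invariant(n_steps, dim, rng=None):
    """Simple slice sampler for the rotationally invariant, log-concave
    target density proportional to exp(-||x||) on R^dim.

    Each iteration draws a level t uniformly below the current density
    value, then draws the new state uniformly from the corresponding
    super-level set, which here is the Euclidean ball of radius -log(t)."""
    rng = np.random.default_rng(rng)
    x = np.zeros(dim)
    samples = []
    for _ in range(n_steps):
        r = np.linalg.norm(x)
        t = rng.uniform(0.0, np.exp(-r))   # level below f(x) = exp(-||x||)
        radius = -np.log(t)                # super-level set {f > t} is this ball
        direction = rng.standard_normal(dim)
        direction /= np.linalg.norm(direction)
        # uniform point in the ball: uniform direction, radius scaled by U^(1/dim)
        x = radius * rng.uniform() ** (1.0 / dim) * direction
        samples.append(x.copy())
    return np.array(samples)
```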
17:30
Exploiting geometry for walking larger steps in Bayesian Inverse Problems
Alexandre Thiery | National University of Singapore | Singapore
Consider the noisy observation of a quantity of interest through a non-linear forward operator. In Bayesian inverse problems, the quantity of interest typically represents the high-dimensional discretization of a continuous and unobserved field while the evaluations of the non-linear forward operator involve solving a system of partial differential equations. In the low-noise regime, the posterior distribution concentrates in the neighbourhood of a non-linear manifold. As a result, the efficiency of standard MCMC algorithms typically deteriorates due to the need to take increasingly smaller steps.
In this work, we present a constrained HMC algorithm that is robust in the small-noise regime -- the efficiency of the sampler does not deteriorate as the posterior distribution concentrates on an increasingly small neighbourhood of a nonlinear manifold. Taking the observations generated by the model to be constraints on the prior, we define a manifold (on an extended space) on which the constrained HMC algorithm generates samples. By exploiting the geometry of the manifold, our algorithm is able to take larger step sizes than more standard MCMC methods, resulting in a more efficient sampler.
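The deterioration of standard samplers in the low-noise regime can be seen in a toy example (our own sketch, not the authors' algorithm): random-walk Metropolis on a two-dimensional posterior that concentrates, as the noise shrinks, around the unit circle, the manifold defined by the forward map G(x) = ||x|| and observation y = 1:

```python
import numpy as np

def rwm_acceptance(step, noise, n_steps=5000, rng=0):
    """Random-walk Metropolis on a toy 2-D posterior that concentrates
    around the unit circle as the noise level shrinks.

    Returns the empirical acceptance rate, illustrating why a standard
    sampler must shrink its step size in the small-noise regime."""
    rng = np.random.default_rng(rng)
    log_post = lambda x: -0.5 * (np.linalg.norm(x) - 1.0) ** 2 / noise ** 2
    x = np.array([1.0, 0.0])   # start on the manifold
    accepts = 0
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal(2)
        if np.log(rng.uniform()) < log_post(prop) - log_post(x):
            x = prop
            accepts += 1
    return accepts / n_steps
```

With noise level 0.01, a step size of 0.5 yields very few accepted moves; recovering a reasonable acceptance rate forces the step size down towards the order of the noise level, which is precisely the behaviour the constrained approach above avoids.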
18:00
Robust Markov chain Monte Carlo methods with respect to tail and scaling properties
Kengo Kamatani | Osaka University | Japan
In this talk, we will discuss Markov chain Monte Carlo (MCMC) methods with light- and heavy-tailed invariant probability distributions. We study the ergodic properties of some MCMC methods with position-dependent proposal kernels and apply them to light- and heavy-tailed target distributions. We also briefly discuss the notion of robustness to heterogeneity recently proposed by Livingstone and Zanella.
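A minimal sketch (our own illustration, not the method analysed in the talk) of a Metropolis-Hastings sampler with a position-dependent proposal: the proposal scale grows like 1 + ||x||, so that moves remain well matched to a heavy-tailed target far out in its tails. Since the proposal is no longer symmetric, the acceptance ratio must include the forward and backward proposal densities:

```python
import numpy as np

def position_dependent_rwm(log_target, n_steps, dim, base_step=1.0, rng=0):
    """Metropolis-Hastings with a position-dependent Gaussian proposal
    N(x, s(x)^2 I), where s(x) = base_step * (1 + ||x||)."""
    rng = np.random.default_rng(rng)
    x = np.zeros(dim)
    chain = []
    for _ in range(n_steps):
        s_x = base_step * (1.0 + np.linalg.norm(x))
        y = x + s_x * rng.standard_normal(dim)
        s_y = base_step * (1.0 + np.linalg.norm(y))
        # log q(x | y) - log q(y | x) for the two Gaussian proposals
        log_q = (-0.5 * np.sum((x - y) ** 2) / s_y ** 2 - dim * np.log(s_y)) \
              - (-0.5 * np.sum((y - x) ** 2) / s_x ** 2 - dim * np.log(s_x))
        if np.log(rng.uniform()) < log_target(y) - log_target(x) + log_q:
            x = y
        chain.append(x.copy())
    return np.array(chain)

# example usage on a heavy-tailed (Cauchy-type) target
chain = position_dependent_rwm(lambda x: -np.log1p(np.sum(x ** 2)), 500, 1)
```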