In many scientific disciplines, researchers encounter inverse problems in which observational data are used to calibrate mathematical models. Hadamard considered the solvability of such problems in terms of their "well-posedness": he called a problem well-posed if a solution exists, if the solution is unique, and if the solution depends continuously on the data. Inverse problems are typically not well-posed (i.e., they are ill-posed) and require some form of regularization. Today's availability of high-performance computing has raised the popularity of statistical approaches to inverse problems and of probabilistic regularizations, such as the Bayesian approach.
In this minisymposium, we consider the robustness and non-robustness (that is, the brittleness) of Bayesian inverse problems and related approaches. This includes robustness with respect to perturbations in the data (that is, well-posedness), but also with respect to perturbations in the prior measure or the likelihood. Perturbations in the prior also include a potential misspecification of the prior model, whereas perturbations in the likelihood include the replacement of the mathematical model by a discretised version or a surrogate. Moreover, we are interested in the robustness of algorithms used for statistical inversion, such as Markov chain Monte Carlo (MCMC), particle filters, variational Bayes, and approximate Bayesian computation.
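In the Bayesian approach mentioned above, the regularization enters through the prior. As a sketch of the standard function-space formulation (with notation assumed here, not taken from the abstracts: prior \(\mu_0\), negative log-likelihood \(\Phi\), posterior \(\mu^y\) given data \(y\)), the posterior is defined by its density with respect to the prior:

```latex
% Bayes' formula in the function-space setting (schematic):
% \mu_0 is the prior measure, \Phi(u; y) the negative log-likelihood
% (data misfit), and \mu^y the posterior given the data y.
\frac{\mathrm{d}\mu^y}{\mathrm{d}\mu_0}(u)
  = \frac{1}{Z(y)} \exp\bigl(-\Phi(u; y)\bigr),
\qquad
Z(y) = \int \exp\bigl(-\Phi(u; y)\bigr)\,\mathrm{d}\mu_0(u).
```

Perturbing the data \(y\), the prior \(\mu_0\), or the misfit \(\Phi\) perturbs the posterior \(\mu^y\); the talks in this session study how strongly.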
On the Local Lipschitz Robustness of Bayesian Inverse Problems
Björn Sprungk | TU Bergakademie Freiberg | Germany
We consider the robustness of posterior measures occurring in Bayesian inverse problems with respect to perturbations of the prior measure and the log-likelihood function. In particular, we prove a general local Lipschitz continuous dependence of the posterior on the prior and the log-likelihood with respect to various common distances between probability measures, including the Hellinger and Wasserstein distances and the Kullback-Leibler divergence. As a corollary, the obtained robustness yields the well-posedness of Bayesian inverse problems in the Wasserstein distance. Moreover, our results indicate an increasing sensitivity to perturbations of the prior or likelihood as the posterior becomes more concentrated, e.g., due to more data or more accurate data. This confirms and extends previous observations made in the robust Bayesian analysis literature. We also comment on the relation between the presented robustness and the phenomenon of Bayesian brittleness.
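Schematically (a sketch of the shape of such a statement, not the precise theorem of the talk), local Lipschitz dependence of the posterior on the prior takes the form:

```latex
% Schematic local Lipschitz robustness w.r.t. the prior:
% for priors \mu_0, \nu_0 in a suitable neighbourhood, the induced
% posteriors \mu^y, \nu^y satisfy
d\bigl(\mu^y, \nu^y\bigr) \;\le\; C \, d\bigl(\mu_0, \nu_0\bigr),
% where d is, e.g., the Hellinger or Wasserstein distance and the
% local constant C depends on the log-likelihood and the neighbourhood.
```

The observed sensitivity to concentration corresponds to the local constant \(C\) growing as the posterior concentrates.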
Consistency of Bayesian inference with Gaussian priors in an elliptic nonlinear inverse problem
Matteo Giordano | University of Cambridge | United Kingdom
We consider nonparametric Bayesian inference for nonlinear inverse problems based on Gaussian process priors. We present posterior consistency results for the problem of recovering the unknown conductivity in an elliptic PDE in divergence form from noisy discrete observations of its solution, and give a convergence rate for the reconstruction error of the associated posterior mean estimator. The analysis is based on a contraction rates theory for the induced regression problem, combined with a stability estimate for the solution map of the PDE.
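The divergence-form problem referred to here is typically of the following type (a sketch with commonly assumed boundary conditions and source term, not necessarily the exact setting of the talk):

```latex
% Elliptic PDE in divergence form: recover the unknown conductivity
% a(x) > 0 from noisy discrete observations of the solution u.
\nabla \cdot \bigl( a(x) \, \nabla u(x) \bigr) = f(x)
  \quad \text{in } \Omega,
\qquad
u = 0 \quad \text{on } \partial\Omega.
```

The map \(a \mapsto u\) is nonlinear in \(a\), which is what makes the Bayesian consistency analysis nontrivial.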
MAP estimators and posterior consistency for Bayesian inverse problems for functions
Masoumeh Dashti | University of Sussex | United Kingdom
We consider the Bayesian approach to the problem of recovering an unknown function from noisy and indirect observations. We discuss some results on the characterisation of the modes of the posterior measure, which lead to weak consistency results. We then present a contraction theory for the posterior measure for a class of exponential priors.