The synthesis of various information sources, such as a priori domain knowledge, statistical assumptions, and field data, with large-scale numerical models is one of the key steps in building interpretable and predictive models to support critical decisions in science, engineering, medicine, and beyond. Typical examples can be found in oil/gas reservoir modeling, treatment of saltwater intrusion, medical imaging, tumor treatment, and aircraft design. Because the numerical models are computationally costly and the accuracy requirements on the statistical learning outcomes are stringent, multilevel and multi-fidelity methods provide a viable route for solving these model-based statistical learning tasks. This mini-symposium brings together researchers working at the forefront of multilevel, multi-fidelity, and other relevant methods intended to accelerate model-based statistical learning tasks.
08:30
Embedded multilevel Monte Carlo for UQ on random domains
Jerrad Hampton | University of Colorado Boulder | United States
Authors:
Santiago Badia | Monash University | Australia
Javier Principe | Universitat Politècnica de Catalunya | Spain
Jerrad Hampton | University of Colorado Boulder | United States
The multilevel Monte Carlo (MLMC) technique has proven to be an effective variance-reduction method for uncertainty quantification in PDE models. It combines approximations at different levels of accuracy using hierarchical meshes, in a way similar to multigrid iterative linear solvers. Generating a mesh hierarchy is possible for simple geometries but impractical for general body-fitted meshes. On the other hand, MLMC for complex random domains involves generating a body-fitted mesh for every sample. Instead, here we consider the extension of MLMC to embedded methods, in a process described as embedded MLMC (EMLMC). In particular, we design an EMLMC framework for (geometrically and topologically) random domains implicitly defined through a random level-set function, which makes use of a set of hierarchical background meshes and the aggregated finite element method (AgFEM). Performance predictions are derived from existing theory. These predictions are verified statistically in three numerical experiments: the solution of Poisson's equation on a circular domain of random radius, the solution of Poisson's equation on a topologically identical but more complex domain, and the solution of a heat-transfer problem in a domain with geometric and topological uncertainty. Finally, the use of AgFEM is statistically demonstrated to be crucial for complex and uncertain geometries in terms of robustness and computational cost, verifying the importance of this component in the EMLMC method.
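As a rough illustration of the telescoping-sum idea behind MLMC (not the embedded-domain framework of this talk), the following minimal Python sketch estimates the expectation of a toy quantity of interest; the level hierarchy, quantity of interest, and sample counts are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def level_approx(omega, level):
    # Midpoint-rule approximation of the toy quantity of interest
    # Q(omega) = int_0^1 exp(omega * x) dx on a mesh with 2**level cells.
    n = 2 ** level
    x = (np.arange(n) + 0.5) / n
    return np.mean(np.exp(omega * x))

def mlmc_estimate(levels, samples_per_level):
    """MLMC telescoping sum: E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_{l-1}]."""
    total = 0.0
    for l, n_samples in zip(levels, samples_per_level):
        omegas = rng.standard_normal(n_samples)
        if l == 0:
            corrections = np.array([level_approx(w, 0) for w in omegas])
        else:
            # Same random input evaluated on two consecutive levels,
            # so the level correction has small variance.
            corrections = np.array(
                [level_approx(w, l) - level_approx(w, l - 1) for w in omegas]
            )
        total += corrections.mean()
    return total

# Many cheap samples on the coarse level, few on the expensive fine levels.
est = mlmc_estimate(levels=[0, 1, 2, 3], samples_per_level=[4000, 1000, 250, 60])
```

Because the level corrections shrink as the meshes refine, most of the sampling effort can be spent on the coarsest (cheapest) level while retaining fine-level accuracy.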
09:00
- CANCELED - Adaptive multi-fidelity surrogate modeling for Bayesian inference in inverse problems
Liang Yan | Southeast University | China
Authors:
Liang Yan | Southeast University | China
Tao Zhou | Chinese Academy of Sciences | China
Performing Bayesian inference via MCMC can be exceedingly expensive when each posterior evaluation invokes a computationally expensive model, such as a system of PDEs. One strategy is to replace the forward model with a low-cost surrogate model; however, simply substituting a low-fidelity model for the high-fidelity model can degrade the approximation quality of the result. In this talk, we address this challenge by introducing an adaptive procedure that constructs a multi-fidelity polynomial chaos surrogate and explores the posterior simultaneously. More precisely, the new strategy starts with a low-fidelity surrogate model and then corrects it adaptively using online high-fidelity data. The key idea is to speed up the MCMC by combining, instead of replacing, the high-fidelity model with the low-fidelity model. We also introduce a multi-fidelity surrogate based on deep neural networks to deal with problems with high-dimensional parameters. Numerical experiments confirm that the proposed approach can obtain accurate posterior information with a limited number of forward simulations.
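A minimal sketch of the combine-rather-than-replace idea (not the talk's polynomial chaos construction): run Metropolis-Hastings on a cheap surrogate log-posterior plus a discrepancy correction that is refit from occasional high-fidelity evaluations. The densities, correction model, and schedule below are all toy assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def logpost_hi(theta):
    # "Expensive" high-fidelity log-posterior (a standard normal here).
    return -0.5 * theta ** 2

def logpost_lo(theta):
    # Crude low-fidelity surrogate with a systematic bias.
    return -0.5 * (theta - 0.5) ** 2

# Discrepancy data gathered online from the high-fidelity model.
hf_thetas, hf_errors = [], []

def corrected_logpost(theta):
    if len(hf_thetas) < 3:
        return logpost_lo(theta)
    # Low-order polynomial fit of the lo->hi discrepancy.
    coeffs = np.polyfit(hf_thetas, hf_errors, deg=2)
    return logpost_lo(theta) + np.polyval(coeffs, theta)

theta, chain = 0.0, []
for step in range(5000):
    prop = theta + 0.8 * rng.standard_normal()
    if np.log(rng.random()) < corrected_logpost(prop) - corrected_logpost(theta):
        theta = prop
    if step % 250 == 0:  # occasional recourse to the high-fidelity model
        hf_thetas.append(theta)
        hf_errors.append(logpost_hi(theta) - logpost_lo(theta))
    chain.append(theta)
```

Once a few high-fidelity discrepancy points are collected, the corrected surrogate recovers the true posterior, while the expensive model is called only once every few hundred MCMC steps.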
09:30
Context-aware model reduction for multifidelity importance sampling
Terrence Alsup | New York University | United States
Authors:
Terrence Alsup | New York University | United States
Benjamin Peherstorfer | New York University | United States
Multifidelity methods leverage low-cost surrogate models to speed up computations and make occasional recourse to expensive high-fidelity models to establish accuracy guarantees. Because surrogate and high-fidelity models are used together, poor approximation by the surrogate models can be compensated for by frequent recourse to high-fidelity models. Thus, there is a trade-off between investing computational resources to improve surrogate models and the frequency of making recourse to high-fidelity models; however, this trade-off is ignored by traditional model reduction methods, which construct surrogate models that are meant to replace high-fidelity models rather than to be used together with them. In this presentation, we consider multifidelity importance sampling and explicitly take into account the trade-off between improving the approximation quality of surrogate models for constructing biasing densities and the frequency of recourse to the high-fidelity models to estimate statistics. Given a total computational budget, an optimization problem determines how much of the budget to invest in constructing a surrogate model versus sampling the high-fidelity model, with the objective of minimizing the error of the estimator. Numerical examples demonstrate that the optimal surrogate models have significantly lower fidelity than what is typically set as the tolerance in traditional model reduction, leading to runtime speedups in our examples.
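The budget-split trade-off can be sketched with a toy error model (the talk's actual optimization problem is not reproduced here; the cost and decay constants below are invented): surrogate investment improves the biasing density, while the remaining budget buys high-fidelity samples, and a grid search picks the split minimizing the modeled MSE.

```python
import numpy as np

def estimator_mse(budget_surrogate, total_budget, cost_hf=1.0, decay=0.05):
    """Toy error model for a multifidelity importance-sampling estimator."""
    n_hf = (total_budget - budget_surrogate) / cost_hf
    if n_hf < 1:
        return np.inf
    # Assumed: surrogate error (hence estimator variance) decays
    # exponentially with the budget invested in building the surrogate.
    surrogate_error = np.exp(-decay * budget_surrogate)
    return (1.0 + 10.0 * surrogate_error) / n_hf

total = 1000.0
splits = np.linspace(0.0, total, 2001)
mses = np.array([estimator_mse(s, total) for s in splits])
best_split = splits[np.argmin(mses)]  # budget to invest in the surrogate
```

Even in this caricature, the optimal surrogate investment is a small fraction of the total budget: a fairly crude surrogate already yields a good biasing density, echoing the observation that optimal surrogates can have much lower fidelity than traditional model-reduction tolerances suggest.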
10:00
Squashing the banana: Transport map-accelerated adaptive importance sampling
Simon Cotter | University of Manchester | United Kingdom
Authors:
Simon Cotter | University of Manchester | United Kingdom
Ioannis Kevrekidis | Johns Hopkins University | United States
Paul Russell | University of Manchester | United Kingdom
Sampling from probability distributions with complex structure can be very challenging, as the ubiquitous Metropolis-Hastings method can exhibit poor mixing. In many applications this complexity can manifest itself in the target distribution being concentrated on a lower dimensional manifold. Without using a method that exploits this structure, proposals will often be made off the manifold and rejected. Transport maps have recently been used as a tool to effectively simplify this complex structure and accelerate Metropolis-Hastings algorithms in such a scenario. In this talk we will see how the same approach can accelerate and stabilize ensemble importance sampling schemes, a family of methods which have favourable properties, including the potential to be very efficient on parallel architectures.
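A minimal sketch of the transport-map idea on the titular banana (using a hand-picked quadratic map rather than the adaptive, ensemble-based construction of the talk; the curvature and width parameters are invented): pushing a Gaussian reference through the map "squashes" the banana, so importance-sampling proposals land on the concentrated region instead of being rejected off it.

```python
import numpy as np

rng = np.random.default_rng(2)
b, sigma = 1.0, 0.5  # banana curvature and width (toy values)

def log_target(x1, x2):
    # "Banana" density: Gaussian in x1, Gaussian in (x2 - b*x1**2).
    return -0.5 * x1 ** 2 - 0.5 * ((x2 - b * x1 ** 2) / sigma) ** 2

# Transport map T(z) = (z1, z2 + b*z1**2) pushes the product-Gaussian
# reference forward onto the banana; its Jacobian determinant is 1.
n = 20000
z1 = rng.standard_normal(n)
z2 = sigma * rng.standard_normal(n)
x1, x2 = z1, z2 + b * z1 ** 2  # proposals, already on the curved "manifold"

log_ref = -0.5 * z1 ** 2 - 0.5 * (z2 / sigma) ** 2
log_w = log_target(x1, x2) - log_ref  # importance weights
w = np.exp(log_w - log_w.max())
mean_x2 = np.sum(w * x2) / np.sum(w)  # self-normalized IS estimate of E[x2]
```

Because this map is exact for the toy target, the importance weights are constant; with an imperfect (e.g. learned) map, the weights stay well behaved as long as the map captures the dominant curvature, which is what stabilizes the ensemble scheme.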