Gaussian Random Fields (GRFs) are ubiquitous models of random functions in computational UQ. Their efficient numerical treatment as input data for PDEs, as models in spatial statistics, and as a key building block within larger UQ simulation loops continues to receive attention in numerical analysis, spatial statistics, and scientific computing.
This mini-symposium will present contributions at the forefront of research, addressing, among other topics, the fast and compressive multilevel simulation of GRFs, the impact of formatted numerical linear algebra (H- and H2-Matrix formats, Quantized Tensor Trains) on GRF simulation and identification, efficient covariance estimation algorithms for GRFs, the interplay of massively parallel PDE solvers and GRFs, multilevel Monte Carlo and quasi-Monte Carlo integration for PDEs with GRF inputs, as well as statistical applications.
14:00
Multilevel representations of stationary Gaussian random fields and efficient sampling methods
Markus Bachmayr | Johannes Gutenberg-Universität Mainz | Germany
The extension of stationary random fields on domains to periodic random fields on a torus turns out to be useful both for drawing samples and for obtaining series expansions of such random fields. In this talk we discuss recent results on the properties of such extensions (where the classical circulant embedding can be regarded as a special case), as well as their implications in UQ applications.
Based on joint works with Albert Cohen, Ivan Graham, Giovanni Migliorati, Van Kien Nguyen, and Robert Scheichl.
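The torus-extension idea can be sketched in one dimension: extend the covariance periodically, diagonalise the resulting circulant matrix with the FFT, and draw samples in Fourier space. The following is a minimal sketch of classical circulant embedding (the special case mentioned in the abstract), assuming for illustration an exponential covariance on a uniform grid in [0, 1]; the choice of covariance and grid is not from the talk.

```python
import numpy as np

def sample_grf_1d(cov, n, seed=None):
    """Sample a stationary GRF on n uniform grid points in [0, 1]
    via circulant embedding (periodic extension to a torus)."""
    rng = np.random.default_rng(seed)
    lags = np.arange(n) / (n - 1)
    row = cov(lags)                              # first row of the Toeplitz covariance
    circ = np.concatenate([row, row[-2:0:-1]])   # symmetric periodic extension
    eigs = np.fft.fft(circ).real                 # circulant eigenvalues (real by symmetry)
    if eigs.min() < -1e-10 * eigs.max():
        raise ValueError("embedding not positive semidefinite; enlarge the torus")
    eigs = np.clip(eigs, 0.0, None)
    m = circ.size
    # complex Gaussian in Fourier space; real part gives one sample with the
    # circulant covariance, restricted to the original grid
    xi = rng.standard_normal(m) + 1j * rng.standard_normal(m)
    z = np.fft.fft(np.sqrt(eigs / m) * xi)
    return z.real[:n]

# one sample of an exponential-covariance field, correlation length 0.3
x = sample_grf_1d(lambda h: np.exp(-h / 0.3), 512, seed=0)
```

For the exponential covariance the minimal embedding is already positive semidefinite; for smoother covariances the torus generally has to be enlarged (padded) before the eigenvalue check passes, which is where the extension results discussed in the talk come in.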
14:30
Fast and exact simulation of univariate and bivariate Gaussian random fields
Olga Moreva | Daimler AG, Werk Sindelfingen | Germany
Circulant embedding is a powerful algorithm for fast simulation of stationary Gaussian random fields on a rectangular grid in R^n, and it works perfectly for compactly supported covariance functions. Cut-off circulant embedding techniques have been developed for univariate random fields for dimensions up to R^3; they rely on modifying the covariance function outside the simulation window so that the modified covariance function is compactly supported. In this talk, we propose extensions of the cut-off approach to univariate and bivariate Gaussian random fields. In particular, we provide a method for simulating bivariate fields with a powered exponential model and the Matérn model for certain sets of parameters.
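The cut-off idea can be illustrated schematically: inside the simulation window the covariance is left untouched, while outside it is replaced by a smooth decay to zero, after which standard circulant embedding applies to the now compactly supported function. The cubic Hermite taper below is a generic illustration only, not one of the specific constructions of the talk; for a given model, the modification must be chosen so that the result remains positive definite, which is precisely the delicate part.

```python
import numpy as np

def cutoff_covariance(cov, b, r):
    """Return a modified covariance equal to cov on [0, b], decaying to
    zero on [b, r] with C^1 matching at h = b, and vanishing beyond r.
    Illustrative taper only: positive definiteness depends on cov, b, r."""
    eps = 1e-6
    c_b = cov(b)
    d_b = (cov(b + eps) - cov(b - eps)) / (2 * eps)   # slope at the cut
    def c(h):
        h = np.asarray(h, dtype=float)
        t = np.clip((h - b) / (r - b), 0.0, 1.0)
        # cubic Hermite segment: value c_b, slope d_b at h=b; value 0, slope 0 at h=r
        hermite = (c_b * (2 * t**3 - 3 * t**2 + 1)
                   + d_b * (r - b) * (t**3 - 2 * t**2 + t))
        return np.where(h <= b, cov(h), np.where(h >= r, 0.0, hermite))
    return c

exp_cov = lambda h: np.exp(-np.asarray(h, dtype=float) / 0.3)
c_mod = cutoff_covariance(exp_cov, b=1.0, r=2.0)   # unchanged on the window [0, 1]
```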
15:00
Efficient simulation of deep Gaussian processes
Michael Feischl | TU Wien | Austria
We use low-rank representation techniques together with Krylov subspace methods to efficiently simulate samples of deep Gaussian processes. Samples of those random fields are generated by using a sample of a standard Gaussian process as covariance information for a second Gaussian process. This procedure can be repeated to an arbitrary depth and hence generates a deep Gaussian process.
The advantage of deep Gaussian processes is that they can model complex phenomena while relying on a very simple construction.
This versatility, however, prevents the use of off-the-shelf simulation techniques like circulant embedding.
Challenges arise from the unknown and random eigenvalue distribution of covariance matrices in deeper layers of those processes as well as from the non-smooth covariance structure. Even when starting with an analytic covariance function in the first layer, the subsequent layers can be arbitrarily rough.
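On a small grid, the layered construction can be sketched with dense linear algebra; the point of the talk is precisely that this brute-force route does not scale, motivating the low-rank and Krylov methods. A two-layer sketch, assuming (our choice, not the talk's) a Paciorek–Schervish-type nonstationary kernel so that the sampled length-scale field yields a valid covariance in the second layer:

```python
import numpy as np

def chol_sample(C, rng, jitter=1e-8):
    """Sample N(0, C) via Cholesky, with a small diagonal jitter for stability."""
    n = len(C)
    L = np.linalg.cholesky(C + jitter * (np.trace(C) / n) * np.eye(n))
    return L @ rng.standard_normal(n)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)
D2 = (x[:, None] - x[None, :])**2

# layer 1: stationary squared-exponential Gaussian process
C1 = np.exp(-D2 / (2 * 0.2**2))
u1 = chol_sample(C1, rng)

# layer 2: u1 drives a log-length-scale field; the Paciorek-Schervish form
# keeps the resulting nonstationary covariance positive semidefinite
ell = np.exp(-2.0 + 0.5 * u1)                  # pointwise length scales > 0
S = ell[:, None]**2 + ell[None, :]**2
C2 = np.sqrt(2 * np.outer(ell, ell) / S) * np.exp(-2 * D2 / S)
u2 = chol_sample(C2, rng)                      # a two-layer deep GP sample
```

Note that C2 is a dense, sample-dependent matrix: its eigenvalue distribution is random and its smoothness is inherited from the rough layer-1 sample, which is exactly the difficulty described above.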
15:30
Fast sampling of parameterised Gaussian random fields
Jonas Latz | University of Cambridge | United Kingdom
Gaussian random fields are popular models for spatially varying uncertainties, arising, e.g., in geotechnical engineering, hydrology or image processing. A Gaussian random field is fully characterised by its mean and covariance operator. In more complex models these can also be partially unknown. In this case we need to handle a family of Gaussian random fields indexed by hyperparameters. Sampling for a fixed configuration of hyperparameters is already very expensive due to the nonlocal nature of many classical covariance operators. Sampling from multiple configurations increases the total computational cost severely. In this talk we employ parameterised Karhunen-Loève expansions for sampling. To reduce the cost we construct a reduced basis surrogate built from snapshots of Karhunen-Loève eigenvectors. In particular, we consider Matérn-type covariance operators with unknown correlation length and standard deviation. We suggest a linearisation of the covariance function and describe the associated offline-online decomposition. In numerical experiments we investigate the approximation error of the reduced eigenpairs. As an application we consider forward uncertainty propagation and Bayesian inversion with an elliptic partial differential equation where the logarithm of the diffusion coefficient is a parameterised Gaussian random field. In the Bayesian inverse problem we employ Markov chain Monte Carlo on the reduced space to generate samples from the posterior measure.
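The offline-online idea can be sketched on a small grid: compute Karhunen-Loève eigenpairs for a few training correlation lengths (offline), collect the eigenvectors into a reduced basis, and for a new hyperparameter solve only a small projected eigenproblem (online). A minimal sketch, assuming a 1D exponential (Matérn-1/2) covariance and dense linear algebra; unlike the talk, it projects the full covariance matrix directly rather than using the linearisation described in the abstract.

```python
import numpy as np

n, k = 200, 20                      # grid size, number of KL modes per snapshot
x = np.linspace(0.0, 1.0, n)
cov = lambda ell: np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

# offline: KL eigenvectors for training correlation lengths -> reduced basis
snapshots = []
for ell in (0.1, 0.2, 0.4):
    w, v = np.linalg.eigh(cov(ell))
    snapshots.append(v[:, -k:])     # k leading eigenvectors (eigh sorts ascending)
V, _, _ = np.linalg.svd(np.hstack(snapshots), full_matrices=False)

# online: for a new correlation length, solve the small projected eigenproblem
ell_new = 0.25
C = cov(ell_new)
w_red, y = np.linalg.eigh(V.T @ C @ V)
phi_red = V @ y[:, -1]              # reduced approximation of the leading KL mode

# compare with the full solve
w_full, v_full = np.linalg.eigh(C)
rel_err = abs(w_red[-1] - w_full[-1]) / w_full[-1]
```

By the Rayleigh-Ritz principle the reduced eigenvalue underestimates the full one; for smoothly varying hyperparameters the gap is small, which is what makes the surrogate attractive for the many-query settings (forward UQ, MCMC) named in the abstract.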