The overwhelming majority of modern applications in the natural sciences, engineering, and beyond require both statistical estimation, to accurately quantify the behavior of unknown distributed parameters in complex systems, and a means of making optimal decisions that are resilient to this uncertainty. In this minisymposium, we aim to connect researchers working on the optimization of complex systems under uncertainty, such as equilibrium problems, differential algebraic equations, and partial differential equations, with statisticians working in variational statistics, infinite-dimensional statistical estimation, and optimum experimental design.
10:30
Extended M-Estimation
Thomas M. Surowiec | Philipps-Universität Marburg | Germany
Authors:
Thomas M. Surowiec | Philipps-Universität Marburg | Germany
Drew P. Kouri | Sandia National Laboratories | United States
M-estimation was introduced by Huber in the 1960s as a method for estimating a parameter $\theta$ by maximizing a payoff function. However, traditional M-estimation, which includes maximum likelihood estimation and nonlinear least-squares regression, does not incorporate additional (subjective or data-driven) prior information. In addition, without resorting to asymptotic statements, we obtain no information on the distribution of $\theta$. In contrast, Bayesian inference offers exactly this information, but typically requires strong assumptions on the prior and noise distributions to aid computation. A major step toward joining the two philosophies was taken by Tong Zhang in a series of papers in 2006, in which he introduced the notion of the Gibbs posterior. We propose a versatile approach that attempts to overcome the potential shortcomings of these traditional estimation procedures. In particular, we make no explicit assumptions on the absolute continuity of the prior with respect to a nominal measure, and we allow for application-specific measures of information beyond the Kullback-Leibler divergence used in the Gibbs posterior. The proposed procedure, which we refer to as extended M-estimation, nevertheless reproduces maximum likelihood estimation, Bayesian inference, and the Gibbs posterior as special cases.
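For orientation, a standard generic form of the Gibbs posterior (notation here is illustrative, not taken from the talk): given data $x_1, \dots, x_n$, a loss $\ell$, a prior $\pi_0$, and a temperature $\lambda > 0$,

```latex
\pi_{n,\lambda}(d\theta) \;\propto\; \exp\!\Big(-\lambda \sum_{i=1}^{n} \ell(\theta, x_i)\Big)\, \pi_0(d\theta),
```

which recovers the Bayesian posterior when $\ell$ is the negative log-likelihood and $\lambda = 1$, and concentrates on the classical M-estimator as $\lambda \to \infty$. Its variational characterization minimizes expected loss plus a Kullback-Leibler term relative to $\pi_0$; it is this information term that the extended framework generalizes.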
11:00
Trend Filtering on Graphs for Exponential Families
Robert Bassett | Naval Postgraduate School | United States
Author:
Robert Bassett | Naval Postgraduate School | United States
Trend filtering is a nonparametric family of adaptive estimators on graphs, based on penalizing the $\ell_{1}$ norm of a discrete derivative of any order. We extend trend filtering from Gaussian observations to those generated by exponential families, and also allow any penalty given by the $\ell_{1}$ norm of a linear map. We discuss rates of convergence for trend filtering in this general case and give recommendations for its computation. Because these estimators are minimizers of a convex program, they are extremely tractable in practice. We conclude by demonstrating the utility of trend filtering for exponential families on a large-scale climate dataset.
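A sketch of the generic estimator described above (symbols are assumptions for illustration, not notation from the abstract): with observations $y$ drawn from an exponential family with natural parameter vector $\beta$ on the graph's nodes, and a matrix $D$ representing a discrete derivative operator (or any linear map),

```latex
\hat{\beta} \;\in\; \operatorname*{arg\,min}_{\beta} \; -\log p(y \mid \beta) \;+\; \lambda \, \lVert D \beta \rVert_{1} .
```

Since the exponential-family negative log-likelihood is convex in the natural parameter and the $\ell_1$ penalty is convex, this is a convex program, which is the source of the tractability noted in the abstract.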
11:30
Scalable Gaussian Process Analysis using Hierarchical Off-Diagonal Low Rank Linear Algebra
Mihai Anitescu | Argonne National Laboratory | United States
Author:
Mihai Anitescu | Argonne National Laboratory | United States
We present a kernel-independent method that applies hierarchical matrices to the problem of maximum likelihood estimation for Gaussian processes. The proposed approximation provides natural and scalable stochastic estimators for the gradient and Hessian of the log-likelihood, as well as for the expected Fisher information matrix, all computable in quasilinear $O(n \log^2 n)$ complexity for a large range of models.
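At the core of such stochastic estimators is randomized trace estimation, since the Gaussian-process score equations involve traces like $\mathrm{tr}(K^{-1}\,\partial_\theta K)$ that are never formed explicitly. A minimal, self-contained sketch of the Hutchinson estimator (illustrative only; the dense random matrix `A` stands in for a matrix that, in the hierarchical setting, is available only through fast matrix-vector products):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
G = rng.standard_normal((n, n))
A = G @ G.T  # symmetric stand-in for a matrix accessed only via matvecs

def hutchinson_trace(matvec, n, num_probes=500, rng=rng):
    """Estimate tr(A) as the average of z^T A z over Rademacher probes z,
    using only matrix-vector products with A."""
    total = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        total += z @ matvec(z)
    return total / num_probes

approx = hutchinson_trace(lambda z: A @ z, n)
exact = float(np.trace(A))
rel_err = abs(approx - exact) / exact
```

With a hierarchical (e.g. HODLR) representation, each probe's matvec costs roughly $O(n \log n)$ rather than $O(n^2)$, which is what makes such estimators scale.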
12:00
A-optimal design of large-scale Bayesian linear inverse problems under uncertainty
Noemi Petra | University of California, Merced | United States
Authors:
Noemi Petra | University of California, Merced | United States
Alen Alexanderian | North Carolina State University | United States
Georg Stadler | Courant Institute of Mathematical Sciences, New York University | United States
Isaac Sunseri | North Carolina State University | United States
We present an efficient method for computing A-optimal experimental designs (OED) for infinite-dimensional Bayesian linear inverse problems governed by partial differential equations (PDEs) under multiple sources of uncertainty. Specifically, we address the problem of optimizing the location of sensors (at which observational data are collected) so as to minimize the posterior uncertainty in a so-called primary uncertain parameter while taking into account the additional (secondary) uncertainty in our models. Computing A-optimal designs entails trace estimation in the infinite-dimensional parameter space, which is challenging due to the high dimensionality of the discretized parameter field. In many large-scale inverse problems, however, the dimension of the measurement space is considerably smaller than that of the parameter space. In this talk we exploit this observation and present an alternative OED formulation that facilitates trace estimation in the measurement space. We present numerical results for inference of the initial condition from spatio-temporal observations in a time-dependent advection-diffusion problem in which the secondary uncertainty is a volume source term.
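One standard mechanism by which such a shift to the measurement space can work (a generic sketch under assumed notation, not the talk's exact formulation): for a linear parameter-to-observable map $F$ with prior covariance $\Gamma_{\mathrm{pr}}$ and noise covariance $\Gamma_{\mathrm{noise}}$, the Sherman-Morrison-Woodbury identity gives

```latex
\Gamma_{\mathrm{post}}
= \Gamma_{\mathrm{pr}} - \Gamma_{\mathrm{pr}} F^{*} M^{-1} F \Gamma_{\mathrm{pr}},
\qquad
M := F \Gamma_{\mathrm{pr}} F^{*} + \Gamma_{\mathrm{noise}},
```

and by cyclicity of the trace, $\operatorname{tr}\!\big(\Gamma_{\mathrm{pr}} F^{*} M^{-1} F \Gamma_{\mathrm{pr}}\big) = \operatorname{tr}\!\big(M^{-1} F \Gamma_{\mathrm{pr}}^{2} F^{*}\big)$, a trace over the (much smaller) measurement space.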