One of the most critical hypotheses in uncertainty quantification studies is the choice of the probability distributions of the uncertain input variables that are propagated through the numerical model. In general, such probability density functions (pdfs) come from various sources (statistical inference, design or operation rules, expert judgment, calibration, etc.) and are therefore established with a certain level of accuracy or confidence. Moreover, in many applications, related for example to industrial safety, engineers are unable to assign a given probability distribution to some of the inputs. This happens, for example, for inputs corresponding to physical parameters for which no data are available.
Hence, rigorously justifying the overall approach requires quantifying the impact of the pdf modeling assumptions on the quantity of interest (QoI). In this context, “input pdf robustness analysis” has recently been defined as a particular setting within the sensitivity analysis domain (alongside screening and quantitative partitioning). Various QoIs can be considered, such as the mean of the model output, its variance, the probability that the output exceeds a threshold, a quantile of the output, or even sensitivity indices.
This Minisymposium, which will be held in two parts (4 presentations in each part), aims to present several recent theoretical developments on this subject, as well as practical and industrial issues.
14:00
The multi-model approach to uncertainty quantification and propagation from small data sets: An overview
Michael D. Shields | Johns Hopkins University | United States
In the absence of sufficiently large data sets from which to perform statistical inference, we are left with significant uncertainty in the form and parameters of probability models for random variables described by such data. Recognizing that this uncertainty is often unavoidable, it becomes essential to incorporate it into mathematical and computational models of the systems of interest. Inspired by the seminal work on multi-model inference of Burnham and Anderson [1], we recently proposed a new method to quantify this uncertainty and propagate it through a mathematical model. By defining a multi-model set, comprising a large statistical set of candidate probability distributions, each with a known probability of being the “best” model for the data, we quantify the distribution-form uncertainty. This can be done either in a fully Bayesian setting or in a combined Bayesian-information-theoretic setting; both approaches will be discussed. I then discuss the efficient propagation of this multi-model uncertainty, considering both independent and dependent random variables, for a variety of objectives: solution moment estimation, global sensitivity analysis, and reliability analysis. These methods are applied to engineering systems in structural mechanics and composite materials modeling.
[1] Burnham, K. P., & Anderson, D. R. (2004). Multimodel inference: Understanding AIC and BIC in model selection. Sociological Methods & Research, 33(2), 261–304.
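As a minimal illustration of the multi-model idea, the Akaike weights of Burnham and Anderson [1] convert the AIC scores of candidate probability models into probabilities of each being the “best” model for the data. The sketch below assumes AIC values have already been computed for each candidate distribution; the numerical scores are hypothetical.

```python
import math

def akaike_weights(aic_scores):
    """Convert AIC scores into model probabilities (Akaike weights).

    delta_i = AIC_i - min(AIC); w_i is proportional to exp(-delta_i / 2).
    """
    best = min(aic_scores)
    deltas = [a - best for a in aic_scores]
    raw = [math.exp(-d / 2.0) for d in deltas]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical AIC scores for, say, lognormal, gamma, and Weibull fits.
weights = akaike_weights([210.3, 211.1, 215.8])
```

In the multi-model approach these weights serve as the known probabilities attached to each member of the multi-model set before propagation.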
14:30
Post-Optimal Design using Optimal Uncertainty Quantification
Luc Bonnet | ONERA - University Paris-Saclay | France
Authors:
Luc Bonnet | ONERA - University Paris-Saclay | France
Eric Savin | ONERA | France
The design performance quoted by manufacturers may differ from the operational performance, owing to the variability of some operational parameters. Design development is usually divided into two phases. The first is to determine the pre-optimal design: using numerical software, the best possible design is chosen with respect to some ideal performances to be achieved. The second phase is to certify, through full-scale experiments, that the design determined previously is valid; in this way, the post-optimal design is specified. This second phase is the most costly one, so manufacturers seek to reduce their reliance on full-scale experiments.
Optimal Uncertainty Quantification is a powerful mathematical tool which can be used to rigorously bound the probability of exceeding a given performance threshold for uncertain operational conditions or system characteristics. This mathematical tool can be formulated as an optimization problem over a set of admissible probability measures. This problem is a priori infinite-dimensional, non-convex and highly-constrained. Thus, it is generally computationally intractable. Nevertheless, it can be reduced to an equivalent finite-dimensional optimization problem over extreme points of the set of admissible probability measures.
The proposed framework will be tested for the robust certification of some performance criteria of an aircraft.
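A toy version of the reduction to extreme points, under assumptions chosen here purely for illustration (a scalar input on [0, 1] with known mean, and an exceedance probability as the performance criterion): with a single mean constraint, the extreme points of the admissible set are measures supported on at most two points, so the worst-case probability can be found by a search over two-point measures. This recovers the classical Markov-type bound min(1, mean/threshold).

```python
def worst_case_exceedance(mean, threshold, grid_size=200):
    """Maximize P(X >= threshold) over all distributions on [0, 1]
    with E[X] = mean, by searching over two-point measures (the
    extreme points of this moment-constrained set)."""
    grid = [i / grid_size for i in range(grid_size + 1)]
    best = 0.0
    for x1 in grid:
        for x2 in grid:
            if x1 >= x2:
                continue
            # Weight on x2 imposed by the mean constraint.
            p = (mean - x1) / (x2 - x1)
            if not 0.0 <= p <= 1.0:
                continue
            prob = p * (x2 >= threshold) + (1 - p) * (x1 >= threshold)
            best = max(best, prob)
    return best
```

The full OUQ problem replaces this brute-force search with a genuine optimization over the finite-dimensional extreme-point parameterization, but the structure is the same.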
15:00
Optimal Uncertainty Quantification of a risk measurement on moment class
Jérôme Stenger | Université Toulouse Paul Sabatier | France
Authors:
Jérôme Stenger | Université Toulouse Paul Sabatier | France
Fabrice Gamboa | Université Toulouse Paul Sabatier | France
Merlin Keller | EDF R&D | France
In uncertainty quantification studies, the uncertain input parameters are modeled as random variables. The choice of their probability distributions usually comes from expert judgment and/or statistical inference. Their specification therefore lacks accuracy, resulting in a second level of uncertainty.
In this work, we gain robustness in the quantification of a risk measurement by accounting for all sources of uncertainty tainting the inputs of a computer code. To that end, we evaluate bounds on the quantity of interest over a class of bounded distributions satisfying constraints on their moments, called the “moment class”.
The problem can be reformulated as the optimization of a quantity of interest over a compact convex set of probability measures. This set is infinite-dimensional and nonparametric, so that the optimization of the quantity of interest is generally computationally intractable. However, when the quantity of interest is a lower semi-continuous and quasi-convex functional (for instance, a probability of failure, a quantile, a moment, or a Sobol’ index), the optimum need only be computed over the extreme points of the set of probability measures.
We identify a well-suited parameterization of the extreme points of the moment class based on the theory of canonical moments. It allows an effective, constraint-free optimization of the quantity of interest. The methodology is applied to a representative test case: the safety evaluation of a flood protection dike.
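A minimal sketch of why canonical moments give a constraint-free parameterization, shown only for the first two moments of a distribution on [0, 1] (the general recursion belongs to the theory of canonical moments invoked by the talk): given a first moment c1, the admissible range of the second moment is [c1², c1], and the second canonical moment p2 simply locates c2 within that range.

```python
def canonical_to_ordinary(p1, p2):
    """Map canonical moments (p1, p2) in (0, 1)^2 to ordinary moments
    (c1, c2) of a distribution supported on [0, 1].

    Given c1, the admissible range of c2 is [c1**2, c1]; p2 locates c2
    inside that range. Hence every point of the unit square maps to a
    valid moment pair -- no moment-space constraints left to enforce.
    """
    c1 = p1
    c2 = c1**2 + p2 * (c1 - c1**2)
    return c1, c2
```

An optimizer can thus search the unit square freely instead of handling the curved geometry of the moment space directly, which is the practical payoff of the parameterization.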
15:30
Reliability-oriented Sobol’ indices under distribution parameter uncertainty
Vincent Chabridon | EDF R&D | France
Authors:
Vincent Chabridon | EDF R&D | France
Mathieu Balesdent | ONERA | France
Guillaume Perrin | CEA/DAM | France
Jérôme Morio | ONERA | France
Jean-Marc Bourinet | Université Clermont Auvergne | France
Nicolas Gayton | Université Clermont Auvergne | France
Uncertainty quantification (UQ) is a global methodology relying on a probabilistic modeling of the uncertain input variables of a computer model. Usually, the joint input probability distribution is supposed to be known and is set so as to represent the intrinsic stochastic behavior and the dependence of the inputs. However, while the type of the marginal distributions can be assessed by rational considerations (e.g., based on the best available statistical information, or according to standards or expert judgment), the distribution parameters are often tainted with a residual statistical uncertainty. In the typical context of reliability assessment, this second uncertainty level affects the robustness of the numerical results obtained in the rest of the UQ study and has to be taken into account in the failure probability estimation. This robustness can be investigated by performing reliability-oriented sensitivity analysis. The present work aims at presenting a set of dedicated reliability-oriented Sobol’ indices taking this bi-level input uncertainty into account. The separation between aleatory (irreducible) and epistemic (reducible) uncertainties is achieved via a disaggregated version of the input random variables. An efficient estimation strategy, based on a splitting algorithm and an adapted kernel density estimation, is proposed. The methodology is applied to a representative test case: the safety evaluation of a flood protection dike.
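The disaggregation idea can be sketched with a deliberately simplified nested Monte Carlo scheme (not the splitting/kernel-density estimator of the talk): the epistemic parameter Θ, here a hypothetical uncertain mean of a Gaussian input, is sampled in an outer loop, the aleatory variability in an inner loop, and a Sobol’-type index is the share of the failure-indicator variance explained by Θ. All distributional choices below are illustrative assumptions.

```python
import random

def bilevel_sobol_index(n_outer=300, n_inner=400, threshold=2.0, seed=0):
    """Nested Monte Carlo estimate of the first-order Sobol' index of
    the epistemic parameter Theta on the failure indicator 1{X > t},
    where X ~ N(Theta, 1) and Theta ~ U(-0.5, 0.5) (illustrative choices).
    """
    rng = random.Random(seed)
    cond_probs = []
    for _ in range(n_outer):
        theta = rng.uniform(-0.5, 0.5)            # epistemic (outer) draw
        fails = sum(rng.gauss(theta, 1.0) > threshold
                    for _ in range(n_inner))       # aleatory (inner) draws
        cond_probs.append(fails / n_inner)         # P(failure | theta)
    p_f = sum(cond_probs) / n_outer                # overall failure probability
    var_cond = sum((p - p_f) ** 2 for p in cond_probs) / n_outer
    var_total = p_f * (1.0 - p_f)                  # variance of the indicator
    return p_f, var_cond / var_total
```

Note that the inner sampling noise slightly inflates the conditional-variance term; dealing efficiently with such estimation issues (especially for rare failures) is precisely what motivates the splitting-algorithm and kernel-density strategy of the talk.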