08:30
Simulator Inputs Screening with One-Factor-at-a-Time Designs
Giovanni Rabitti | Department of Decision Sciences, Bocconi University, Milan | Italy
Authors:
Emanuele Borgonovo | Department of Decision Sciences, Bocconi University, Milan | Italy
Giovanni Rabitti | Department of Decision Sciences, Bocconi University, Milan | Italy
Dimension reduction is an essential task in simulation. Best practices suggest the use of global indicators, and recent research is exploring their connection to screening methods. Taking a fresh look at the Morris method, we show that the associated simulator runs yield estimates of Sobol’ individual, total and interaction indices, as well as of the simulator mean dimension. The exact relationship between Morris sensitivity measures and Sobol’ total indices is established. The associated asymptotic analysis yields confidence intervals around the estimates at finite sample sizes, which in turn allow the introduction of a conservative screening criterion that takes uncertainty in the estimates into account. Numerical experiments are carried out on benchmark simulators such as the Asian option, the assemble-to-order and the space probabilistic safety assessment models. The simulator runs enable the analyst not only to identify irrelevant inputs, but also to gain insights into the size of interactions and the trend of the simulator response.
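As a rough illustration of the connection the talk establishes, the sketch below estimates Sobol’ total indices from one-factor-at-a-time runs. It is not the authors’ estimator: it uses Jansen’s formula with an independently resampled coordinate in place of a fixed Morris step, assumes uniform inputs on the unit cube, and uses a toy additive test function.

```python
# Minimal sketch: total Sobol' indices from OAT blocks via Jansen's formula,
# assuming inputs uniform on [0, 1]^d and an independent resample of the
# perturbed coordinate instead of a fixed Morris step Delta.
import numpy as np

def oat_total_indices(f, d, r, rng=np.random.default_rng(0)):
    """Estimate Sobol' total indices from r one-factor-at-a-time blocks."""
    base = rng.random((r, d))               # base point of each OAT block
    y_base = np.apply_along_axis(f, 1, base)
    num = np.zeros(d)
    for i in range(d):
        pert = base.copy()
        pert[:, i] = rng.random(r)          # change only coordinate i
        y_pert = np.apply_along_axis(f, 1, pert)
        num[i] = 0.5 * np.mean((y_pert - y_base) ** 2)  # Jansen numerator
    return num / np.var(y_base, ddof=1)     # normalize by output variance

# Additive toy function: each T_i should be proportional to its squared
# coefficient (9 : 1 : 0.01).
f = lambda x: 3.0 * x[0] + 1.0 * x[1] + 0.1 * x[2]
print(oat_total_indices(f, d=3, r=20000))
```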
08:50
The connection between Bayesian Inference and Information Theory for model selection, information gain and experimental design
Sergey Oladyshkin | University of Stuttgart | Germany
Authors:
Sergey Oladyshkin | University of Stuttgart | Germany
Wolfgang Nowak | University of Stuttgart | Germany
We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates. On the one hand, we demonstrate how Bayesian model selection can profit from information theory to estimate the BME value via posterior-based techniques; to this end, we employ various assumptions, including relations to several information criteria. On the other hand, we demonstrate how relative entropy can profit from BME to assess information entropy during Bayesian updating and to assess utility for Bayesian experimental design. Specifically, we emphasize that relative entropy can be computed from both prior- and posterior-based sampling techniques while avoiding unnecessary multidimensional integration. Prior-based computation does not require any assumptions, whereas posterior-based estimates require at least one. We illustrate the performance of the discussed estimates of BME, information entropy and experiment utility using a very simple example. The multivariate Gaussian posterior estimate requires the fewest assumptions and shows the best performance for estimating BME, information entropy and experiment utility from posterior-based sampling.
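The link being exploited can be written as KL(posterior || prior) = E_posterior[log likelihood] − log BME, so relative entropy follows from a BME estimate plus posterior samples of the log-likelihood, with no multidimensional integration. The sketch below checks this identity on a toy conjugate Gaussian model of our own choosing; it is not the authors’ code.

```python
# Toy check of: KL(posterior || prior) = E_post[log L] - log BME.
# Prior: theta ~ N(0, 1); data: D | theta ~ N(theta, s^2), one observation.
import numpy as np
rng = np.random.default_rng(1)

s, D = 0.5, 1.2
log_like = lambda th: -0.5 * np.log(2 * np.pi * s**2) - (D - th)**2 / (2 * s**2)

# Prior-based BME: arithmetic mean of likelihoods over prior samples.
th_prior = rng.normal(0.0, 1.0, 200_000)
log_bme = np.log(np.mean(np.exp(log_like(th_prior))))

# Conjugate posterior is Gaussian, so we can sample it directly.
post_var = 1.0 / (1.0 + 1.0 / s**2)
post_mean = post_var * D / s**2
th_post = rng.normal(post_mean, np.sqrt(post_var), 200_000)

kl_mc = np.mean(log_like(th_post)) - log_bme          # identity, Monte Carlo
kl_exact = 0.5 * (post_var + post_mean**2 - 1 - np.log(post_var))
print(f"KL via identity: {kl_mc:.4f}, analytic: {kl_exact:.4f}")
```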
09:10
Adaptive Design of Experiments for Global Surrogate Modeling through Cross-Validation information
Aikaterini Kyprioti | University of Notre Dame | United States
Authors:
Aikaterini Kyprioti | University of Notre Dame | United States
Jize Zhang | Lawrence Livermore National Laboratory | United States
Alexandros Taflanidis | University of Notre Dame | United States
This presentation will discuss a new sequential adaptive design of experiments (DoE) approach for global Kriging metamodeling applications. The sequential implementation is established by using the current metamodel, formulated based on the existing experiments, to guide the selection of the optimal new experiment(s). The score function, defining the DoE objective, combines two components: the metamodel prediction variability, expressed through the predictive variance, and the metamodel bias, approximated through the leave-one-out cross-validation (LOOCV) error. The former will be expressed here by popular score functions, such as the integrated mean squared error (IMSE) and the maximum mean squared error (MMSE), while the latter will be used as a weighting factor. Incorporating bias information as a weight facilitates a direct extension of existing workflows that rely solely on the aforementioned score functions, making the proposed implementation attractive from a computational perspective (it integrates seamlessly into such workflows). An efficient optimization scheme for identifying the next experiment, as well as the balance of exploration and exploitation between the two components of the score function (metamodel bias and variability), will also be discussed. Analytical and engineering examples will be presented to showcase the efficiency of the proposed scheme compared to popular alternatives.
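As a simplified illustration of the weighting idea (not the authors’ exact IMSE/MMSE score), the sketch below implements an MMSE-style criterion for a fixed-hyperparameter Gaussian process: the predictive variance at each candidate is multiplied by the closed-form leave-one-out error of the nearest existing design point, and the weighted maximum is selected.

```python
# LOOCV-weighted maximum-variance selection for sequential Kriging DoE.
# Fixed RBF kernel hyperparameters; the LOO errors come from the standard
# closed form e_i = (K^{-1} y)_i / (K^{-1})_{ii}.
import numpy as np

def kernel(A, B, l=0.3):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * l**2))

def next_point(X, y, cand, l=0.3, nug=1e-8):
    K = kernel(X, X, l) + nug * np.eye(len(X))
    Kinv = np.linalg.inv(K)
    loo_err = np.abs((Kinv @ y) / np.diag(Kinv))     # closed-form LOOCV errors
    k_star = kernel(cand, X, l)
    var = 1.0 - np.einsum('ij,jk,ik->i', k_star, Kinv, k_star)  # pred. variance
    nearest = np.argmin(((cand[:, None, :] - X[None, :, :]) ** 2).sum(-1), axis=1)
    return cand[np.argmax(var * loo_err[nearest])]   # bias-weighted MMSE pick

rng = np.random.default_rng(2)
f = lambda X: np.sin(6 * X[:, 0]) * X[:, 1]          # toy response
X = rng.random((8, 2)); y = f(X)
print(next_point(X, y, rng.random((500, 2))))
```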
09:30
Effect of random parameters in nonlinear regression on an optimal experimental design
Daniela Jaruskova | Czech Technical University | Czech Republic
Author:
Daniela Jaruskova | Czech Technical University | Czech Republic
We consider a nonlinear regression model that describes the relationship between input parameters and the output of an experiment. The output is measured repeatedly at several time points. The regression function is assumed to contain additional random parameters that remain the same within a single experiment but differ from one experiment to another. Due to these additional random parameters, the variability of the least squares (LS) estimates of the parameters of interest may be large and their distribution may not be normal. To choose a good experimental design we may be interested in the complete joint distribution of the LS estimates. We compare three methods for approximating this distribution and illustrate them with examples.
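A brute-force Monte Carlo reference for that joint distribution, against which such approximations can be compared, might look like the sketch below. The exponential-decay model, the random shift b and all numbers are illustrative choices of ours, not the examples from the talk.

```python
# Monte Carlo reference: simulate many experiments, each with its own random
# nuisance parameter b (fixed within an experiment), refit by least squares,
# and inspect the empirical, possibly non-normal, joint distribution.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.5, 5.0, 12)                  # repeated measurement times
beta_true = np.array([2.0, 0.8])               # parameters of interest

def model(t, beta, b):
    return beta[0] * np.exp(-beta[1] * t) + b  # b: random per-experiment shift

rng = np.random.default_rng(3)
est = []
for _ in range(2000):                          # one LS fit per experiment
    b = rng.normal(0.0, 0.3)                   # random parameter of this run
    y = model(t, beta_true, b) + rng.normal(0.0, 0.05, t.size)
    fit = least_squares(lambda p: model(t, p, 0.0) - y, x0=[1.0, 1.0])
    est.append(fit.x)
est = np.array(est)
print("mean:", est.mean(0), "\ncov:\n", np.cov(est.T))
```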
09:50
Bioprocess design space identification using constrained global sensitivity analysis
Panagiotis Demis | University of Surrey | United Kingdom
Authors:
Panagiotis Demis | University of Surrey | United Kingdom
Sergei Kucherenko | Imperial College London | United Kingdom
Oleksiy Klymenko | University of Surrey | United Kingdom
Recent advances in targeted protein-based therapies have underpinned a rapid growth in the number of approved biopharmaceuticals. The Quality-by-Design (QbD) principles adopted by the FDA imply that manufacturing processes for such therapeutics must be designed and operated in accordance with stringent requirements on their Key Performance Indicators (KPIs) and on the Critical Quality Attributes (CQAs) of the product. QbD approaches are based on a quantitative understanding of the process, with CQAs and KPIs acting as constraints to ensure the desired product quality, reduce testing and ultimately reduce the cost of treatment for patients. The identification of the process operational Design Space (DS), the subspace of input parameters that satisfies the QbD constraints, is therefore of utmost importance.
DS identification can be extremely computationally demanding, since it requires comprehensive sampling of the input parameter space and numerous evaluations of the process model. Moreover, an explicit description of the DS becomes more complicated and computationally expensive with increasing input space dimensionality. We have developed a novel approach based on our previously developed constrained Global Sensitivity Analysis (cGSA) for input space dimensionality reduction. This improves computational efficiency both by reducing the sampling of the infeasible input subspace and by simplifying the problem of finding an explicit description of the DS.
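The sketch below gives a crude, illustrative proxy for constraint-aware screening (not Kucherenko’s cGSA estimators): sample the inputs, flag the QbD-feasible subset, and rank inputs by how strongly feasibility depends on them, so that the explicit DS description can be restricted to the influential ones. The four-input process model and its two constraints are hypothetical.

```python
# Crude constraint-aware screening proxy for design-space identification.
import numpy as np
rng = np.random.default_rng(4)

# Hypothetical 4-input process model with two CQA/KPI-style constraints.
def kpis(x):
    titer = 5 * x[:, 0] + 2 * x[:, 1] * x[:, 2]
    impurity = x[:, 3] + 0.5 * x[:, 0] ** 2
    return titer, impurity

X = rng.random((100_000, 4))                   # inputs uniform on [0, 1]^4
titer, imp = kpis(X)
feasible = (titer > 3.0) & (imp < 0.8)         # QbD constraints define the DS
print(f"feasible fraction: {feasible.mean():.3f}")

# Inputs whose feasible-set distribution departs most from the prior are the
# ones the explicit DS description must keep; the rest can be fixed.
for i in range(4):
    shift = abs(X[feasible, i].mean() - 0.5)   # prior mean of U(0, 1) is 0.5
    print(f"x{i}: mean shift within DS = {shift:.3f}")
```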
10:10
Application of experimental design-based assisted history matching for uncertainty quantification in radioactive waste repositories
Jörg Buchwald | Helmholtz Zentrum für Umweltforschung GmbH - UFZ | Germany
Authors:
Jörg Buchwald | Helmholtz Zentrum für Umweltforschung GmbH - UFZ | Germany
Aqeel Afzal Chaudhry | Technische Universität Bergakademie Freiberg | Germany
Olaf Kolditz | Helmholtz Zentrum für Umweltforschung GmbH - UFZ | Germany
Sabine Attinger | Helmholtz Zentrum für Umweltforschung GmbH - UFZ | Germany
Thomas Nagel | Technische Universität Bergakademie Freiberg | Germany
In the performance assessment of nuclear waste disposal concepts, a thorough analysis of uncertainty and sensitivity of the underlying coupled thermo-hydro-mechanical-chemical (THMC) processes over several temporal and spatial scales is necessary. A critical aspect of this analysis is the evaluation of the integrity of the geotechnical and geological barriers for the specific waste repository concept.
For this purpose, we examine the applicability of a design of experiments (DoE)-based assisted history matching workflow using synthetic experimental data, an approach commonly used in the oil and gas industry in the context of reservoir modeling.
Based on an analytical solution of a spherically symmetric thermo-hydro-mechanical problem of a heat source embedded in a fluid-saturated porous medium, we discuss the adaptability of the workflow to the field of radioactive waste disposal research as a potential way to address both parameter and model uncertainties. In doing so, we place particular focus on the influence of the input parameter distributions on the uncertainty of the model output.
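A stripped-down version of such a workflow might look like the sketch below: evaluate a space-filling design of parameter combinations on a forward model, score each run against the synthetic observations, and retain the low-misfit ensemble as the history-matched set. The diffusion-like forward model here is a toy stand-in for the analytical THM point-heat-source solution, not the authors’ model.

```python
# DoE-based assisted history matching, toy version: design -> forward runs ->
# misfit against synthetic data -> accepted (history-matched) ensemble.
import numpy as np
rng = np.random.default_rng(5)

t = np.linspace(1.0, 10.0, 20)                 # observation times
def forward(k, c):                             # toy diffusive temperature rise
    return c / np.sqrt(t) * np.exp(-k / t)

k_true, c_true = 2.0, 5.0
obs = forward(k_true, c_true) + rng.normal(0.0, 0.05, t.size)  # synthetic data

# Space-filling design over the prior ranges of the two parameters (k, c).
design = np.column_stack([rng.uniform(0.5, 4.0, 5000),
                          rng.uniform(1.0, 10.0, 5000)])
misfit = np.array([np.sum((forward(k, c) - obs) ** 2) for k, c in design])

matched = design[misfit < np.quantile(misfit, 0.01)]  # keep best 1% of runs
print("history-matched parameter ranges:", matched.min(0), matched.max(0))
```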