Many modern simulators of physically realistic phenomena use multiple, heterogeneous sub-models, possibly involving different types of physical modelling and dimensionality. This usage poses many challenges, since tasks such as building surrogates, designing experiments, exploring sensitivities, and reducing dimensions must all account for the links across sub-models. Both theoretical investigations and implementations are hampered by the complex nature of such models, and need to be tailored to the specific chain of models. In this mini-symposium, we present a series of talks that address these challenges and offer theoretical as well as practical solutions, together with illustrations. In particular, realistic models of geophysical and biological hazards often include feedbacks across sub-models, or combine sub-models of precursory phenomena (which set the stage for dangerous events and can be informed by monitoring data) with models of the hazardous phenomenon itself. These challenges require solutions that acknowledge the interactions across multi-physics components.
14:00
- CANCELED - An efficient dimension reduction for the Gaussian process emulation of two nested codes with functional outputs
Sophie Marque-Pucheu | Orange Gardens | France
Authors:
Sophie Marque-Pucheu | Orange Gardens | France
Guillaume Perrin | CEA/DAM/DIF | France
Josselin Garnier | Centre de Mathematiques Appliquees, Ecole Polytechnique | France
In this talk, we first propose an efficient method for reducing the dimension of the functional input of a code with functional output. It is based on approximating the output by a model that is linear with respect to the functional input. This approximation has a sparse structure whose parameters can be accurately estimated from a small set of observations of the code. The Gaussian predictor based on this projection basis is significantly more accurate than one based on a projection obtained with Partial Least Squares. Secondly, we consider the surrogate modelling of two nested codes with functional outputs, where the functional output of the first code is one of the inputs of the second code. The Gaussian process regression of the second code is performed using the proposed dimension reduction. A Gaussian predictor of the nested code is obtained by composing the predictors of the two codes and linearizing this composition. Finally, two sequential design criteria are proposed. Since we aim to perform a sensitivity analysis, these criteria are based on minimizing the prediction variance. Moreover, one of the criteria makes it possible to choose, when feasible, which of the two codes to run. Thus, the computational budget is optimally allocated between the two codes and the prediction error is substantially reduced.
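The composition-and-linearization step can be sketched numerically. The following is a minimal illustration, not the authors' implementation: `predictor1` and `predictor2` are invented stand-ins for the Gaussian predictors of the two codes, and the nested variance follows a first-order (delta-method) linearization of the composition.

```python
import numpy as np

def predictor1(x):
    """Toy Gaussian predictor of the first code: y1 | x ~ N(mu1, var1)."""
    mu1 = np.sin(x)
    var1 = 0.01 * (1.0 + x**2)
    return mu1, var1

def predictor2(y1):
    """Toy Gaussian predictor of the second code: y2 | y1 ~ N(mu2, var2)."""
    mu2 = y1**2 + 1.0
    var2 = 0.02
    return mu2, var2

def d_mu2(y1, eps=1e-6):
    """Finite-difference derivative of the second predictor's mean."""
    return (predictor2(y1 + eps)[0] - predictor2(y1 - eps)[0]) / (2 * eps)

def nested_predictor(x):
    """Linearized composition of the two Gaussian predictors:
    mean ~= mu2(mu1(x)),  var ~= var2 + (d mu2/d y1)^2 * var1."""
    mu1, var1 = predictor1(x)
    mu2, var2 = predictor2(mu1)
    slope = d_mu2(mu1)
    return mu2, var2 + slope**2 * var1

mu, var = nested_predictor(0.5)
```

The linearization keeps the nested predictor Gaussian, with the first code's predictive variance inflated by the squared local sensitivity of the second code's mean.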
14:30
Integrated Emulation of Systems of Simulators
Deyu Ming | University College London | United Kingdom
Authors:
Deyu Ming | University College London | United Kingdom
Serge Guillas | University College London | United Kingdom
We construct an integrated emulator of a feed-forward (multi-physics or even multi-disciplinary) system of simulators with possible internal feedback loops, by integrating out its internal inputs using the Gaussian process emulators of the individual simulators. The integrated emulator predicts the global output of the system using a Gaussian distribution whose mean and variance are explicit functions of the global inputs. Analytical solutions for the mean and variance are established for a variety of kernels (exponential, squared exponential, and two key Matérn kernels). We compare our integrated emulator with the composite emulator, i.e. the emulator of the entire system built using only global inputs and outputs. The integrated emulator exploits the functional relationships across individual simulators, whereas the composite emulator inherently ignores them. As a result, the integrated emulator learns the behaviour of the system better than the composite emulator given identical training points, and often even with fewer. Furthermore, it allows new sampling strategies whereby the cheapest simulators can be run more times to improve the overall quality of the integrated emulator, without running the more expensive ones. Such strategies could reduce the cost of training the integrated emulator of a system with heterogeneous simulators by several orders of magnitude. We also demonstrate the performance of our method on a multi-disciplinary problem.
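The key ingredient, integrating an internal (linking) input out of a Gaussian process predictor, admits a closed form for the squared exponential kernel. A minimal sketch under invented weights and training outputs (`expected_sq_exp` is the standard Gaussian integral identity; the weights stand in for the second emulator's kernel weights and are not from the talk):

```python
import numpy as np

def sq_exp(y, t, l):
    """Squared exponential kernel with length-scale l."""
    return np.exp(-(y - t)**2 / (2 * l**2))

def expected_sq_exp(m, s2, t, l):
    """Closed-form E[k(Y, t)] for Y ~ N(m, s2): the Gaussian integral of a
    squared exponential kernel against a Gaussian density."""
    return l / np.sqrt(l**2 + s2) * np.exp(-(m - t)**2 / (2 * (l**2 + s2)))

def integrated_mean(m, s2, weights, train_y, l):
    """Mean of the integrated emulator when the linking input Y is Gaussian
    with mean m and variance s2; `weights` play the role of K^{-1} y."""
    return sum(w * expected_sq_exp(m, s2, t, l)
               for w, t in zip(weights, train_y))

# Hypothetical numbers purely for illustration.
mu_link = integrated_mean(0.3, 0.2, [0.5, -0.2], [0.0, 1.0], 1.0)
```

Because the expectation is analytic, the internal input never needs to be sampled, which is what makes the integrated emulator's mean and variance explicit functions of the global inputs.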
15:00
Approaches to the Emulation of Chains of Computer Models with Application to Epidemic Policy Making
David Woods | University of Southampton | United Kingdom
Authors:
David Woods | University of Southampton | United Kingdom
Samuel Jackson | University of Southampton | United Kingdom
We have developed novel Bayesian emulation methodology to analyse chains of computer models, where the outputs of one model feed into the next. Such computer models, with inputs and outputs representing quantities of interest, are frequently developed to help understand real-world processes. The motivation for this work comes from linking atmospheric dispersion, dose-response and epidemiological models. Emulation is a well-tested approach to efficiently understanding computationally intensive models. Yet, for analysing chains of computer models, existing approaches often focus on approximating the entire chain with a single emulator. We instead focus on linking Bayes linear emulators of each component model of a chain, and have developed emulators for models whose inputs are uncertain. A first method analyses each emulator's behaviour for a sample of inputs drawn from a probability distribution commensurate with our beliefs about the output of the previous emulator. A second method extends the field of emulation to incorporate uncertain inputs directly within each emulator itself. We demonstrate the potential of these novel approaches on intuitive examples before applying them to the modelling of epidemic diseases. Applying our techniques to models of such epidemics permits detailed uncertainty quantification via, for example, thorough sensitivity analysis of the effect of unknown quantities, thus aiding online policy decision making in the event of an epidemic.
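The first linking method can be sketched as follows. This is an illustrative toy, not the authors' Bayes linear machinery: `emulator1` and `emulator2` are invented stand-ins that each return an adjusted expectation and variance, and Gaussian sampling is an assumed choice of distribution matching those two moments.

```python
import numpy as np

rng = np.random.default_rng(0)

def emulator1(x):
    """Toy first-stage emulator: (adjusted expectation, adjusted variance)."""
    return 2.0 * x, 0.05

def emulator2(y):
    """Toy second-stage emulator, evaluated at (possibly vectorized) inputs."""
    return np.sin(y), 0.01

def chained_prediction(x, n_samples=10_000):
    """Draw second-stage inputs from a distribution commensurate with the
    first emulator's output beliefs, then pool the second-stage predictions."""
    m1, v1 = emulator1(x)
    y_samples = rng.normal(m1, np.sqrt(v1), n_samples)
    m2, v2 = emulator2(y_samples)
    mean = m2.mean()
    # Law of total variance: within-emulator + between-sample contributions.
    var = v2 + m2.var()
    return mean, var

mean, var = chained_prediction(0.5)
```

Pooling by the law of total variance keeps the propagated input uncertainty visible in the chained prediction rather than discarding it.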
15:30
- CANCELED - Probabilistic hazard mapping in a volcanic field under rapidly evolving monitoring signals: integration of probability maps of vent opening location and physical models of pyroclastic density currents
Andrea Bevilacqua | Istituto Nazionale di Geofisica e Vulcanologia | Italy
Authors:
Andrea Bevilacqua | Istituto Nazionale di Geofisica e Vulcanologia | Italy
Mattia de' Michieli Vitturi | Istituto Nazionale di Geofisica e Vulcanologia | Italy
Tomaso Esposti Ongaro | Istituto Nazionale di Geofisica e Vulcanologia | Italy
Augusto Neri | Istituto Nazionale di Geofisica e Vulcanologia | Italy
Abani Patra | Tufts University | United States
Marcus Bursik | University at Buffalo | United States
E. Bruce Pitman | University at Buffalo | United States
Elaine Spiller | Marquette University | United States
Pyroclastic density currents (PDCs) are laterally moving, expanding mixtures of hot gas and fragmental particles. PDCs represent one of the most dangerous hazards during explosive eruptions, and mapping their hazard is important for risk mitigation. In this study, we perform PDC simulations in the Campi Flegrei caldera using 2D depth-averaged granular flow models. We carry out a Monte Carlo simulation over the volume of the mixture, the parameters of the flow rheology, and the location of the new eruptive vent. Campi Flegrei caldera is an active and densely populated volcanic area in the urban neighborhood of Napoli, characterized by the presence of many dispersed past eruptive vents, which complicates the forecasting of a new vent location. Single, point-like eruptive vents are the idealized initiation sites of the PDCs. We model the uncertainty affecting the location of new vents through a probability density function (pdf) defined over the volcanic field. This pdf is dynamic: new monitoring information can change it within a few hours or days via a Bayesian update. Developing an efficient strategy to propagate such a change into the hazard maps by re-using the available numerical simulations is an important scientific target, with major societal impact during a volcanic crisis. Here we present various approaches, utilizing both bootstrap strategies and statistical emulators, capable of coupling the PDC hazard mapping with the evolving model of new vent locations.
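One simple way to re-use existing simulations when the vent-opening pdf changes is importance reweighting. The following toy 1D sketch, with an invented transect geometry and inundation criterion, illustrates the idea only; the approaches presented in the talk rely on bootstrap strategies and statistical emulators rather than this scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: vent locations drawn uniformly on a 1D transect [0, 10];
# each "simulation" records whether a target site is inundated.
vents = rng.uniform(0.0, 10.0, 5000)
inundated = vents > 6.0          # invented PDC reach criterion

def gaussian_pdf(v, center):
    return np.exp(-(v - center)**2 / 2.0) / np.sqrt(2.0 * np.pi)

def hazard_probability(vent_pdf):
    """Reweight the existing runs by the current vent-opening pdf
    (importance weights against the uniform proposal density 1/10),
    so an updated pdf requires no new PDC simulations."""
    w = vent_pdf(vents) / (1.0 / 10.0)
    return np.sum(w * inundated) / np.sum(w)

p_prior = hazard_probability(lambda v: gaussian_pdf(v, 5.0))
# A monitoring-driven Bayesian update shifts the likely vents eastward:
p_post = hazard_probability(lambda v: gaussian_pdf(v, 7.0))
```

Reweighting is cheap but degrades when the updated pdf concentrates far from where the simulations were sampled, which is one motivation for the emulator-based strategies in the talk.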