08:30
Integrating Subset Simulation with Asymptotic Sampling for Estimating Small Failure Probabilities
Matej Leps | Czech Technical University in Prague | Czech Republic
Authors:
Matej Leps | Czech Technical University in Prague | Czech Republic
Adela Hlobilova | Czech Technical University in Prague | Czech Republic
A novel simulation approach integrating Subset Simulation with Asymptotic Sampling is proposed to estimate small failure probabilities. Asymptotic Sampling is an advanced simulation technique that predicts a reliability index from the asymptotic behavior of the probability of failure in an n-dimensional i.i.d. space. Several crude Monte Carlo simulations with different standard deviations are performed to obtain a sequence of reliability indices, and the final reliability index is extrapolated from this sequence. The drawback lies in the extrapolation; the main benefit is the higher probability of sampling the failure region as the standard deviation grows. Subset Simulation is an adaptive sampling technique based on a formulation of the failure event as an intersection of intermediate failure events. The rare-event problem is then reformulated as a series of more frequent conditional events that are easier to solve. This sequence of conditional events is sampled by Markov chain Monte Carlo. Although the method is regarded as very stable and powerful, it can fail on highly nonlinear functions. This contribution therefore combines the benefits of both sampling methods: first, Asymptotic Sampling is used either to allocate samples near the limit state or at least to determine the proper direction to the failure region; second, the estimate of the failure probability is refined with Subset Simulation.
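As a rough illustration of the Asymptotic Sampling half of this idea, the sketch below inflates the sampling standard deviation, estimates the scaled reliability indices by crude Monte Carlo, and extrapolates back to the original scale. The limit-state function g and the extrapolation model beta(f)/f = A + B/f^2 are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of Asymptotic Sampling, assuming a hypothetical limit-state
# function g(u) in standard normal space: failure occurs when g(u) <= 0.
import numpy as np
from scipy.stats import norm

def g(u):
    # Hypothetical linear limit state with exact beta = 3.5 for reference.
    return 3.5 - u.sum(axis=1) / np.sqrt(u.shape[1])

rng = np.random.default_rng(0)
n_dim, n_mc = 10, 10_000
scales = np.array([2.0, 2.5, 3.0])           # inflated standard deviations f > 1

betas = []
for f in scales:
    u = rng.standard_normal((n_mc, n_dim)) * f   # crude MC with std f
    pf_f = np.mean(g(u) <= 0.0)                  # failures now occur frequently
    betas.append(-norm.ppf(pf_f))                # scaled reliability index beta(f)

# Extrapolate beta(f)/f = A + B/f^2 back to the original scale f = 1.
A, B = np.linalg.lstsq(
    np.column_stack([np.ones_like(scales), scales**-2]),
    np.array(betas) / scales, rcond=None)[0]
beta_1 = A + B                                   # estimated beta at f = 1
print(f"extrapolated beta = {beta_1:.2f}, pf = {norm.cdf(-beta_1):.2e}")
```

In the combined method proposed here, samples obtained this way would then seed Subset Simulation rather than serve as the final estimate.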
08:50
An agent-based approach to explore complex dynamics of engineered systems for probabilistic safety analyses
Nadine Berner | Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) gGmbH | Germany
Authors:
Nadine Berner | Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) gGmbH | Germany
Josef Scheuer | Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) gGmbH | Germany
The probabilistic investigation of dynamic behaviour is of paramount importance for a comprehensive risk assessment of complex engineered systems, such as nuclear facilities. In general, a deterministic simulation code is used to study the dynamics of a system in the presence of epistemic and aleatory uncertainties of safety-relevant features. Given the complexity of the system, the consideration of uncertainties poses a demanding task in terms of the continuous probabilistic evaluation of the system state and the efficient use of computational resources.
To dynamically explore the phase space of the system for critical transitions subject to uncertainty, we present an agent-based approach that combines Monte Carlo (MC) sampling with a Dynamic Event Tree (DET) approach. This MCDET approach is based on a generic scheduling concept which, in tandem with a probabilistic evaluation of the simulation after each time step, can be viewed as a set of agents exploring the phase space of the system along sequences of rare and critical state transitions. The acquired data hierarchy represents the sampled DETs for a probabilistically specified set of safety-relevant transitions and allows various probabilistic estimates of interest to be derived. Here, we explain the MCDET approach in detail using a heated tank system and discuss the insights gained by applying it in complex safety studies of nuclear facilities.
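To make the agent-and-scheduling idea concrete, here is a deliberately minimal toy sketch, not GRS's implementation: a queue of branch "agents" steps a heated-tank model in time, spawns valve-opens/valve-fails branches when the setpoint is reached, and prunes branches below a probability cutoff. All parameters and names are invented for illustration.

```python
# Toy Dynamic Event Tree sketch: temperature rises deterministically; at the
# setpoint a relief valve either opens or fails, branching the tree.
from collections import deque

T_SET, T_CRIT = 80.0, 100.0       # valve setpoint and critical temperature
FAIL_PROB, P_CUTOFF = 0.05, 1e-4  # branching probability and pruning cutoff
DT, HEAT, COOL = 1.0, 2.0, 3.0    # time step and heating/cooling rates

# Each "agent" is one branch: (time, temperature, valve_state, path probability);
# valve_state is None before the demand, then True (open) or False (failed).
queue = deque([(0.0, 20.0, None, 1.0)])
p_fail = 0.0

while queue:
    t, temp, valve, prob = queue.popleft()
    if temp >= T_CRIT:                    # critical transition reached
        p_fail += prob
        continue
    if t >= 60.0:                         # mission time exhausted: safe branch
        continue
    if valve is None and temp >= T_SET:
        # Probabilistic evaluation after the time step: branch the tree.
        if prob * FAIL_PROB >= P_CUTOFF:
            queue.append((t, temp, False, prob * FAIL_PROB))   # valve fails
        queue.append((t, temp, True, prob * (1 - FAIL_PROB)))  # valve opens
        continue
    rate = HEAT - (COOL if valve else 0.0)
    queue.append((t + DT, temp + rate * DT, valve, prob))

print(f"probability of reaching T_CRIT: {p_fail:.3e}")
```

The actual MCDET approach additionally samples continuous aleatory variables by Monte Carlo around each such tree, which this sketch omits.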
09:10
Reliability-oriented sensitivity analysis in presence of data-driven epistemic uncertainty
Gabriel Sarazin | ONERA - The French Aerospace Lab | France
Authors:
Gabriel Sarazin | ONERA - The French Aerospace Lab | France
Jérôme Morio | ONERA - The French Aerospace Lab | France
Agnès Lagnoux | Université de Toulouse Jean Jaurès | France
Mathieu Balesdent | ONERA - The French Aerospace Lab | France
Loic Brevault | ONERA - The French Aerospace Lab | France
Reliability assessment often boils down to evaluating a complex simulation code whose input-output evaluations carry a prohibitive computational cost. The random uncertainty on the input variables is modeled by a multivariate probability distribution. After propagating the uncertainties, a failure criterion is defined on the resulting output distribution. In this work, the knowledge of the input distribution is incomplete and limited to a small number of observations coming from the field. A probabilistic model must be fitted to the data before a reliability analysis algorithm can estimate the failure probability associated with this specific dataset. Particular focus is put on sampling variability: another dataset from the same underlying distribution could have been provided and could have led to a different risk estimate. The main purpose of this work is to identify which component of the learnt distribution is responsible for the largest share of the variability in the risk assessment. Addressing this sensitivity analysis problem is not easy because the uncertainty sources are random batches arising from the split of a correlated dataset. Our contribution is a numerical scheme to break this deadlock: Sobol indices are computed, and the cornerstone for fairly apportioning the output variance between the copula and the margins is a bootstrap-based resampling mechanism coupled with the pick-freeze method.
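The pick-freeze mechanism at the heart of this scheme can be shown on a standalone toy problem. The sketch below estimates a first-order Sobol index for an Ishigami-type test function; the function and sample size are illustrative assumptions (the paper applies the estimator to the risk estimate viewed as a function of bootstrapped data batches).

```python
# Minimal pick-freeze estimator of a first-order Sobol index on a toy function.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

def model(x):
    # Ishigami test function (a = 7, b = 0.1), a standard SA benchmark.
    return (np.sin(x[:, 0]) + 7 * np.sin(x[:, 1])**2
            + 0.1 * x[:, 2]**4 * np.sin(x[:, 0]))

x  = rng.uniform(-np.pi, np.pi, (n, 3))   # "pick" sample
xp = rng.uniform(-np.pi, np.pi, (n, 3))   # independent copy
# "Freeze" X1: keep column 0 from x, redraw the other inputs from xp.
x_frozen = xp.copy()
x_frozen[:, 0] = x[:, 0]

y, y_frozen = model(x), model(x_frozen)
# Pick-freeze estimator of S1 = Var(E[Y|X1]) / Var(Y).
s1 = (np.mean(y * y_frozen) - np.mean(y) * np.mean(y_frozen)) / np.var(y)
print(f"first-order Sobol index S1 ~ {s1:.2f}")   # ~0.31 for Ishigami
```

In the paper's setting the "frozen" input is one entity of the learnt distribution (the copula or a margin), and the bootstrap supplies the independent copies that a direct resplit of the small dataset cannot.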
09:30
- CANCELED - Selecting Reduced Models in the Cross-Entropy Method
Patrick Héas | INRIA | France
Author:
Patrick Héas | INRIA | France
This work deals with the estimation of rare-event probabilities using importance sampling, where an optimal proposal distribution is computed with the cross-entropy (CE) method. Although importance sampling optimized with the CE method leads to an efficient reduction of the estimator variance, the approach remains unaffordable for problems where repeated evaluation of the score function is too computationally intensive. This is often the case for score functions related to the solution of a partial differential equation with random inputs.
This work proposes to alleviate the computation by the parsimonious use of a hierarchy of score function approximations in the CE optimization process. The score function approximation is obtained by selecting the surrogate of lowest dimensionality whose accuracy is guaranteed to suffice for the current CE optimization stage. The selection of the surrogate relies on certified upper bounds on the error norm. An asymptotic analysis provides theoretical guarantees on the efficiency and convergence of the proposed algorithm. Numerical results demonstrate the gain brought by the method in the context of pollution alerts, with a system modeled by a PDE.
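For orientation, a bare-bones version of the CE machinery being accelerated might look as follows. The Gaussian proposal family, the score function, and all tolerances are assumptions; the paper's surrogate hierarchy would enter by replacing `score` with its cheapest sufficiently accurate approximation at each stage.

```python
# Minimal cross-entropy importance sampling sketch for a rare event
# P(score(X) >= 4) with X ~ N(0, I); the score function is an assumption.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def score(x):
    return x.sum(axis=1) / np.sqrt(x.shape[1])

gamma_target, n, rho = 4.0, 5_000, 0.1
mu, sigma = np.zeros(2), np.ones(2)              # proposal N(mu, diag(sigma^2))

for _ in range(20):
    x = rng.normal(mu, sigma, (n, 2))
    s = score(x)
    gamma = min(gamma_target, np.quantile(s, 1 - rho))   # adaptive level
    elite = x[s >= gamma]
    # Likelihood ratio of the nominal N(0, I) vs the current proposal.
    w = np.exp(norm.logpdf(elite, 0, 1).sum(1)
               - norm.logpdf(elite, mu, sigma).sum(1))
    mu = (w[:, None] * elite).sum(0) / w.sum()           # weighted CE update
    sigma = np.sqrt((w[:, None] * (elite - mu)**2).sum(0) / w.sum())
    if gamma >= gamma_target:
        break

# Final importance-sampling estimate with the optimized proposal.
x = rng.normal(mu, sigma, (n, 2))
lw = norm.logpdf(x, 0, 1).sum(1) - norm.logpdf(x, mu, sigma).sum(1)
pf = np.mean(np.exp(lw) * (score(x) >= gamma_target))
print(f"CE-IS estimate: {pf:.2e}  (exact: {norm.sf(4):.2e})")
```

Every call to `score` inside the loop is where the cost lies when the score involves a PDE solve, which is exactly what the proposed surrogate selection targets.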
09:50
- CANCELED - Probabilistic earthquake risk assessment using deep learning
Fritz Harland Sihombing | Ulsan National Institute of Science and Technology | Korea, Republic of
Authors:
Fritz Harland Sihombing | Ulsan National Institute of Science and Technology | Korea, Republic of
Hoang Dac Nguyen | Ulsan National Institute of Science and Technology | Korea, Republic of
Marco Torbol | Ulsan National Institute of Science and Technology | Korea, Republic of
Earthquake risk assessment in structural engineering calculates the damage probability of a building or structure given an earthquake occurrence. It is carried out by computing the structure's response numerically, which yields a point estimate. To obtain the distribution of this estimate, a probabilistic approach introduces randomness into the parameters, for example the geometry and/or the material. This approach, known as the Monte Carlo approach, requires a large number of simulations and is computationally expensive and time-consuming. Uncertainty quantification using surrogate models offers efficiency in both cost and time, provided that we understand the source and nature of the uncertainty in the model. We used deep learning as our surrogate model to estimate the structure's response to the earthquake, and we explored its capability to capture the uncertainty introduced by the randomness in the model's parameters. We measured the KL divergence and compared the statistical properties of the Monte Carlo approach and the deep learning surrogate. Lastly, we built the fragility curves for both approaches.
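As background for that last step, a fragility curve can be fitted by maximum likelihood to binary damage outcomes, whether those come from nonlinear analyses or from a surrogate's predictions. The sketch below uses synthetic drift data and a lognormal fragility model; both are illustrative assumptions, not the authors' data or network.

```python
# Minimal maximum-likelihood fragility fit on synthetic exceedance data.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(3)

im = rng.uniform(0.1, 2.0, 500)                 # intensity measure, e.g. PGA [g]
# Synthetic responses: drift grows with IM plus random model variability; in
# the paper these would come from Monte Carlo runs or the DL surrogate.
drift = 0.02 * im * rng.lognormal(0.0, 0.4, im.size)
damaged = (drift > 0.025).astype(float)         # binary exceedance outcomes

def neg_loglik(theta):
    med, beta = theta                           # median IM and dispersion
    p = norm.cdf(np.log(im / med) / beta)       # lognormal fragility model
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(damaged * np.log(p) + (1 - damaged) * np.log(1 - p))

med, beta = minimize(neg_loglik, x0=[1.0, 0.4], method="Nelder-Mead").x
print(f"fragility: median IM = {med:.2f} g, dispersion beta = {beta:.2f}")
```

Comparing the curves (and the KL divergence of the response distributions) fitted from surrogate outputs against those from full Monte Carlo is the consistency check the abstract describes.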
10:10
Integration of reduced order modelling and multifidelity Monte Carlo simulation for efficient seismic risk assessment
Dimitrios Patsialis | University of Notre Dame | United States
Authors:
Dimitrios Patsialis | University of Notre Dame | United States
Alexandros Taflanidis | University of Notre Dame | United States
The use of high-fidelity Finite Element Models (FEMs) in seismic risk assessment involves a substantial computational burden when the assessment relies on nonlinear time-history analysis. The authors recently developed a reduced order modeling (ROM) framework to alleviate this burden. The ROM is built by considering only the dynamic degrees of freedom (DoFs) of the original FEM and approximating the hysteretic restoring forces between these DoFs with a simplified description, calibrated by comparing the ROM time-history response to that of the FEM for some benchmark excitations. In this work, the potential bias in the ROM-based risk predictions is addressed within a multi-fidelity Monte Carlo (MC) simulation setting. The low-cost response approximations of the ROM are combined with the computationally expensive response estimation offered by the high-fidelity FEM to provide unbiased risk predictions. The number of simulations for each of the models is optimally selected to minimize the coefficient of variation of the MC estimator. Since seismic risk assessment requires simultaneous estimation of multiple quantities of interest, related to different response outputs or to different thresholds describing performance, guidelines for achieving satisfactory variance reduction across all of them are discussed. Different case studies demonstrate the computational savings and accuracy improvement established.
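The core of such a two-fidelity estimator can be sketched in a few lines. Both the "FEM" and "ROM" below are cheap stand-ins, and the control-variate form shown is one common way to combine the fidelities into an unbiased estimate; the sample allocation here is fixed rather than optimized as in the paper.

```python
# Minimal two-fidelity (control-variate) Monte Carlo sketch: many cheap ROM
# runs correct a few expensive FEM runs. Both models are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(4)

def fem(x):   # expensive high-fidelity response (stand-in)
    return np.sin(x) + 0.10 * x**2

def rom(x):   # cheap, slightly biased low-fidelity response (stand-in)
    return np.sin(x) + 0.08 * x**2

n_hf, n_lf = 50, 5_000                     # allocation favours the cheap model
x_hf = rng.normal(0, 1, n_hf)
x_lf = rng.normal(0, 1, n_lf)

y_hf, y_lf_paired = fem(x_hf), rom(x_hf)   # ROM evaluated on the same inputs
# Control-variate coefficient from the paired high/low-fidelity runs.
alpha = np.cov(y_hf, y_lf_paired)[0, 1] / np.var(y_lf_paired, ddof=1)

# Unbiased multi-fidelity estimate of E[fem(X)]: the ROM bias cancels in the
# correction term because both ROM means estimate the same quantity.
mf_est = y_hf.mean() + alpha * (rom(x_lf).mean() - y_lf_paired.mean())
print(f"multi-fidelity estimate: {mf_est:.4f}  (HF-only: {y_hf.mean():.4f})")
```

Choosing n_hf and n_lf to minimize the estimator's coefficient of variation under a cost budget, simultaneously for several quantities of interest, is the allocation problem the talk addresses.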