OpenTURNS is an open source library for uncertainty propagation by probabilistic methods. Developed by a partnership of five industrial companies (EDF, Airbus, Phimeca, IMACS and ONERA), it benefits from strong practical feedback. Classical UQ algorithms are available: central dispersion, probability of exceedance, sensitivity analysis, metamodels and stochastic processes. Written in C++, OpenTURNS is also available as a Python module and has gained maturity through more than 10 years of development. The goal of this minisymposium is to gather the OpenTURNS community and to give an overview of trends within the software, the associated research topics and its industrial uses.
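To fix ideas, the central-dispersion task mentioned above can be sketched in a few lines of plain Python. This is a minimal illustration of the kind of Monte Carlo propagation that OpenTURNS automates (not the OpenTURNS API itself); the cantilever-like model and its input distributions are hypothetical.

```python
import math
import random

# Hypothetical toy model: deflection y = F * L**3 / (3 * E * I)
# with uncertain load F and Young's modulus E.
def model(F, E, L=1.0, I=1.0e-4):
    return F * L**3 / (3.0 * E * I)

random.seed(0)
n = 100_000
samples = []
for _ in range(n):
    F = random.gauss(100.0, 10.0)                    # load: Normal(100, 10)
    E = random.lognormvariate(math.log(2.0e9), 0.1)  # stiffness: LogNormal
    samples.append(model(F, E))

# Central dispersion: mean and standard deviation of the output
mean = sum(samples) / n
var = sum((y - mean) ** 2 for y in samples) / (n - 1)
print(mean, math.sqrt(var))
```

In OpenTURNS the same steps (distribution definition, sampling, statistics) are handled by dedicated classes, with many more distributions and estimators available.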
14:00
Overview of OpenTURNS, its new features and its graphical user interface
Michaël Baudin | EDF | France
Authors:
Michaël Baudin | EDF | France
Anne Dutfoy | EDF R&D | France
Anthony Geay | EDF R&D | France
Ovidiu Mircescu | EDF R&D | France
Aurélie Ladier | Phiméca Engineering | France
Julien Schueller | Phiméca Engineering | France
Antoine Dumas | Phiméca Engineering | France
Thibault Delage | EDF | France
The statistical characterization of the output of a system subject to uncertain inputs is the core of OpenTURNS, an open source library for uncertainty propagation by probabilistic methods. In the first part, the new features of the library will be presented, which include sequential statistical estimators of the mean and the estimation of Sobol' indices based on asymptotic statistics. We will also review the new calibration algorithms, which are based on least squares and Bayesian MAP estimators. In the second part, we review the new features of the open source GUI PERSALYS that allow statistical dependencies to be taken into account through copulas (and their parameters to be estimated). Moreover, we will present how the GUI can propagate uncertainties through a one-dimensional field function and provide interactive graphical tools using ParaView and the High Density Region dimension-reduction algorithm.
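The Sobol' index estimation mentioned in the abstract rests on pick-freeze sampling. As background, here is a pure-Python sketch of the Saltelli (2010) first-order estimator on a toy additive model whose exact indices are known (S1 = 0.2, S2 = 0.8); the model and sample size are illustrative, and OpenTURNS provides its own classes for this.

```python
import random

random.seed(1)

def g(x):
    # Toy additive model with Uniform(0,1) inputs:
    # Var = (1 + 4)/12, so S1 = 0.2 and S2 = 0.8 analytically.
    return x[0] + 2.0 * x[1]

n, d = 20000, 2
A = [[random.random() for _ in range(d)] for _ in range(n)]
B = [[random.random() for _ in range(d)] for _ in range(n)]
yA = [g(a) for a in A]
yB = [g(b) for b in B]

mean = sum(yA + yB) / (2 * n)
var = sum((y - mean) ** 2 for y in yA + yB) / (2 * n - 1)

S = []
for i in range(d):
    # Pick-freeze design: A with column i replaced by B's column i
    yABi = [g(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
    # Saltelli (2010) first-order estimator
    Si = sum(yb * (yi - ya) for yb, yi, ya in zip(yB, yABi, yA)) / n / var
    S.append(Si)

print(S)  # expected close to [0.2, 0.8]
```

The asymptotic confidence intervals discussed in the talk come from the fact that such estimators are averages of i.i.d. terms, so a central limit theorem applies.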
14:30
Uncertainties in Civil Engineering: a FE-model calibration process for Cultural Heritage applications
Vladimir Cerisano Kovačević | Kobe Innovation Engineering and University of Florence | Italy
Authors:
Vladimir Cerisano Kovačević | Kobe Innovation Engineering and University of Florence | Italy
Michele Betti | Kobe Innovation Engineering and University of Florence | Italy
Since the assessment of the vulnerability of existing and/or historical structures requires the support of numerical models (usually based on the FE technique), quantifying and reducing the effects introduced by the unavoidable uncertainties is essential in order to provide effective computational predictions of their structural behavior. Uncertainties affecting civil structures can be grouped into two main categories: aleatory and epistemic. While aleatory uncertainties are typically irreducible (because they are induced by the randomness of physical phenomena), epistemic uncertainties are potentially reducible, since they are due to a lack of knowledge and to approximations made when modeling the overall physical environment. In this respect, experimental tests can offer significant support: the outcome of the numerical analyses should match the observed condition or the testing data that can be retrieved from the actual structure. In this paper, a workflow developed within an open source environment based on EDF's salome_meca, code_aster, OpenTURNS and Python libraries is presented, with the aim of describing the finite element model calibration. This calibration is fundamental for existing structures and represents a first assessment of the digital twin, which should be able to replicate the current condition of the structure and to predict its behavior under extreme conditions. Some illustrative case studies in the field of Cultural Heritage structures are finally discussed.
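The core idea of least-squares model calibration can be shown on a deliberately tiny surrogate of an FE model. The load-displacement data and the one-parameter spring model below are hypothetical stand-ins for the workflow described in the abstract, where code_aster would supply the model evaluations and OpenTURNS the calibration algorithms.

```python
# Hypothetical measured load-displacement pairs from a static test
loads = [10.0, 20.0, 30.0, 40.0]          # applied loads F_i
disp_obs = [0.051, 0.098, 0.152, 0.199]   # measured displacements u_i (noisy)

# Model: u = F / k, a linear spring standing in for the FE model.
# Least squares in c = 1/k has the closed-form solution
# c* = sum(F_i * u_i) / sum(F_i**2), hence the calibrated k* = 1 / c*.
c = sum(F * u for F, u in zip(loads, disp_obs)) / sum(F * F for F in loads)
k = 1.0 / c
print(k)  # calibrated stiffness, close to the "true" value 200 behind the data
```

In a real FE calibration the parameter-to-prediction map is nonlinear and evaluated by the solver, so the minimization is iterative rather than closed-form, but the residual-minimization principle is the same.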
15:00
Using OpenTURNS for surrogate modeling in the context of uncertainty reduction and data assimilation for 1D and 2D hydrodynamics
Sophie Ricci | CECI, CERFACS/CNRS-5318 | France
Authors:
Sophie Ricci | CECI, CERFACS/CNRS-5318 | France
Matthias De Lozzo | Institut Recherche et Technologie Saint-Exupéry | France
Siham El Garroussi | CECI, CERFACS/CNRS-5318 | France
Nicole Goutal | EDF R&D | France
Didier Lucor | Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur | France
Isabelle Mirouze | CERFACS | France
River hydraulic models are used for environmental risk assessment associated with flooding and, consequently, to inform decision support systems for civil security needs. Uncertainties in model input variables such as friction, hydrologic inflows and bathymetry translate into uncertainties in model output quantities of interest such as water level, discharge and velocity, thus limiting the use of a deterministic approach. An ensemble-based approach is often favored, as it provides a statistical description of the hydraulic variables given some statistical assumptions on the inputs. Identifying and quantifying the major sources of uncertainty is achieved with a sensitivity analysis that relies on the classical Monte Carlo approach, known for its robustness but also for its slow convergence and large associated computational cost. This cost is significantly reduced when surrogate models are used, without loss of accuracy. The merits of this strategy were shown in previous studies using OpenTURNS for statistical computations and metamodeling (polynomial chaos expansion and kriging). When the input-output relationship presents discontinuities, polynomial chaos surrogate models fail to provide a satisfying solution; more advanced approaches, such as mixtures of experts, should be investigated. Finally, the uncertainty quantification ensemble framework is naturally compatible with ensemble-based data assimilation methods, such as the Ensemble Kalman filter, that aim at reducing the uncertainty.
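The polynomial chaos surrogates mentioned above are regressions on an orthogonal polynomial basis. The following pure-Python sketch builds a degree-2 Legendre chaos for a toy model on a uniform input by solving the regression normal equations; the model and sample design are illustrative assumptions, not the hydraulic setup or the OpenTURNS implementation.

```python
import random

# Legendre basis, orthogonal for a uniform input on [-1, 1] (degree <= 2)
def legendre(k, x):
    if k == 0:
        return 1.0
    if k == 1:
        return x
    return (3.0 * x * x - 1.0) / 2.0  # k == 2

def model(x):          # "expensive" model to be replaced by the surrogate
    return x * x

# Regression design: random sample points and the basis matrix
random.seed(2)
pts = [random.uniform(-1, 1) for _ in range(200)]
Phi = [[legendre(k, x) for k in range(3)] for x in pts]
y = [model(x) for x in pts]

# Solve the normal equations (Phi^T Phi) c = Phi^T y by Gaussian elimination
d = 3
M = [[sum(Phi[n][i] * Phi[n][j] for n in range(len(pts))) for j in range(d)]
     for i in range(d)]
rhs = [sum(Phi[n][i] * y[n] for n in range(len(pts))) for i in range(d)]
for i in range(d):                       # forward elimination
    for r in range(i + 1, d):
        f = M[r][i] / M[i][i]
        M[r] = [a - f * b for a, b in zip(M[r], M[i])]
        rhs[r] -= f * rhs[i]
c = [0.0] * d
for i in reversed(range(d)):             # back substitution
    c[i] = (rhs[i] - sum(M[i][j] * c[j] for j in range(i + 1, d))) / M[i][i]

def surrogate(x):
    return sum(c[k] * legendre(k, x) for k in range(d))

print(c)  # expected close to [1/3, 0, 2/3], since x**2 = 1/3 + (2/3) * P2(x)
```

Because the chaos coefficients come from a smooth global basis, a discontinuous input-output map is poorly captured, which is precisely the limitation that motivates mixtures of experts in the abstract.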
15:30
Generating random waveforms for stochastic tsunami simulations
Luca Arpaia | BRGM | France
Authors:
Luca Arpaia | BRGM | France
Jeremy Rohmer | BRGM | France
Anne Lemoine | BRGM | France
Rodrigo Pedreros | BRGM | France
Numerous near-field earthquake-induced tsunami simulations have shown that the spatial distribution of the fault slip can have a significant impact on the nearshore wave propagation and, ultimately, on the wave height in some sensitive areas. However, fault rupture is a complex phenomenon that depends on the pre-rupture stress condition, the geometrical setting and the frictional properties of the fault, which are largely unknown. In the present study, we model the spatial slip field on the fault as a random field [Mai and Beroza, 2002]. Instead of resorting to the classical Gaussian assumption, random slip patterns are modeled using a joint lognormal distribution to account for the fact that the probability of negative values is zero. The Karhunen-Loève (KL) expansion is used to generate the slip patterns [Gonzalez et al., 2016], while the free-surface ocean displacement is computed by means of the Okada solution of an elastic dislocation problem. Finally, wave propagation is performed by numerically solving the non-linear shallow water equations. The interest is to compactly represent the mapping between a variable of interest at the coast, Y, and the vector of input random variables (the coefficients of the KL expansion) through the generalized Polynomial Chaos Expansion (gPCE) implemented in the OpenTURNS software [Baudin et al., 2015]. The gPCE surrogate is tested on idealized one-dimensional benchmarks as well as on the historical 1755 Lisbon tsunami event.
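The lognormal KL construction of the abstract can be illustrated on a deliberately small example: a Gaussian log-slip field on two fault cells with exponential correlation, whose KL (eigen) decomposition is known in closed form, exponentiated to guarantee positive slip. The correlation length, median slip and sample size are hypothetical; the real study uses many cells and OpenTURNS machinery.

```python
import math
import random

random.seed(3)

# Two-cell Gaussian log-slip field with exponential correlation exp(-h/L)
rho = math.exp(-1.0)          # correlation between the two fault cells
# KL decomposition of the covariance [[1, rho], [rho, 1]], closed form:
lam = [1.0 + rho, 1.0 - rho]                       # eigenvalues
phi = [[1 / math.sqrt(2), 1 / math.sqrt(2)],       # eigenvectors
       [1 / math.sqrt(2), -1 / math.sqrt(2)]]

mu = math.log(1.0)            # median slip of 1 m (hypothetical)

def sample_slip():
    xi = [random.gauss(0, 1), random.gauss(0, 1)]  # KL coefficients
    g = [sum(math.sqrt(lam[k]) * xi[k] * phi[k][j] for k in range(2))
         for j in range(2)]
    return [math.exp(mu + gj) for gj in g]         # lognormal => positive

slips = [sample_slip() for _ in range(50000)]
assert all(s[0] > 0 and s[1] > 0 for s in slips)   # no negative slip values

# The empirical covariance of the log-slip should recover rho
logs = [(math.log(s[0]), math.log(s[1])) for s in slips]
m0 = sum(l[0] for l in logs) / len(logs)
m1 = sum(l[1] for l in logs) / len(logs)
cov = sum((l[0] - m0) * (l[1] - m1) for l in logs) / (len(logs) - 1)
print(cov)  # close to exp(-1), i.e. about 0.368
```

The KL coefficients xi are exactly the input random variables of the gPCE surrogate described in the abstract, which maps them to the coastal quantity of interest Y.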