14:00
A Grassmann Manifold-based adaptive sampling method
Prof. Michael D. Shields | Johns Hopkins University | United States
Authors:
Prof. Michael D. Shields | Johns Hopkins University | United States
Dr. Dimitrios Giovanis | Johns Hopkins University
The quantification of uncertainties (UQ) in the design process of engineering structures is an important field of growing interest. While most modern design codes rely on partial safety factors calibrated to a target structural reliability, stochastic modeling requires a simulation-based approach in which a vast number of evaluations of the response are performed in order to identify its probability law. In this work, we propose a new adaptive stochastic simulation-based method for UQ based on observing variations in the projection of the solution onto the Grassmann and Stiefel manifolds. These manifolds have a nonlinear geometry and, as a consequence, attributes of the response that cannot be identified in Euclidean spaces become tractable. By computing the singular value decomposition (SVD) of the high-dimensional solution corresponding to a realization of the uncertain input parameters, we obtain a point on the Grassmann manifold and subsequently in its corresponding tangent space (i.e., the Stiefel manifold). In a manner similar to the simplex stochastic collocation method [1], the input parameter space is discretized into a set of simplex elements using a Delaunay triangulation. Within each element, variations of the corresponding response on the Grassmann manifold are estimated by measuring the geodesic Grassmann distances between the vertex subspaces. Elements with large variations on the Grassmann manifold are subsampled and refined. The same procedure is repeated for the points now located in the Stiefel manifold which, being a tangent space, is flat and therefore more amenable to interpolation [2].
[1] Witteveen, Jeroen A. S., and Gianluca Iaccarino. "Simplex stochastic collocation with random sampling and extrapolation for nonhypercube probability spaces." SIAM Journal on Scientific Computing 34.2 (2012): A814–A838.
[2] Amsallem, David, and Charbel Farhat. "Interpolation method for adapting reduced-order models and application to aeroelasticity." AIAA Journal 46.7 (2008): 1803–1813.
14:20
Modeling high-dimensional inputs with copulas for uncertainty quantification problems
Dr. Emiliano Torre | Switzerland
Authors:
Dr. Emiliano Torre | Switzerland
Dr. Stefano Marelli | ETH Zurich | Switzerland
Prof. Paul Embrechts | ETH Zurich | Switzerland
Prof. Bruno Sudret | ETH Zurich | Switzerland
In the context of uncertainty quantification (UQ) for computational
models, a probabilistic representation of the input parameters is
necessary. In most real-world settings where a data set of inputs is
available, a joint distribution needs to be inferred, which usually shows
dependence between the different parameters. Characterizing
dependencies in high dimensions may be challenging. Moreover, some
efficient techniques such as polynomial chaos expansions (PCE) require that
the input parameters are independent. In practical cases, physical input
variables are transformed into independent auxiliary variables,
e.g. through the Rosenblatt transform.
In this contribution we propose to model the joint distribution of the
input variables by vine copulas. Copulas are families of joint
probability distributions that allow one to represent dependencies among
variables separately from their marginal distributions. Vine copulas further
ease the copula estimation problem in high dimension by factorizing the
copula into conditional pair copulas of the components.
Additionally, their formulation makes it easy to derive both the
direct and inverse Rosenblatt transforms.
This property provides an efficient way to both de-correlate the input
variables (e.g. to perform PCE) and to generate space-filling samples of the
input parameters, e.g. by back-transforming Latin Hypercube samples or
quasi-random numbers such as Sobol' sequences.
The resulting sample enables one to perform a large class of UQ analyses (e.g.
reliability analysis or propagation by PCE) at limited computational costs.
We exemplify the proposed method on previously published data from
earthquake signals. The parameters of a synthetic earthquake generator are
represented both by vine copulas and by a Gaussian copula. The latter, employed
in previous publications, is used here for comparison. The synthetic earthquake
signals are then used as input to simple mechanical oscillators, and the
resulting statistics of the output displacements are compared.
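For the Gaussian-copula baseline, the inverse Rosenblatt transform takes a particularly simple form: a Cholesky factor maps an independent low-discrepancy design to dependent physical inputs. A hedged sketch, with an illustrative correlation and marginals that are assumptions, not taken from the study:

```python
import numpy as np
from scipy import stats

# Gaussian copula: impose dependence on an independent quasi-random
# design, then map to (assumed) physical marginals.
R = np.array([[1.0, 0.7],
              [0.7, 1.0]])                    # copula correlation (assumed)
L = np.linalg.cholesky(R)

u = stats.qmc.Sobol(d=2, seed=0).random(256)  # independent uniforms
z = stats.norm.ppf(u)                # to independent standard normals
z_dep = z @ L.T                      # impose the dependence structure
u_dep = stats.norm.cdf(z_dep)        # dependent uniforms
x = np.column_stack([
    stats.lognorm(s=0.5).ppf(u_dep[:, 0]),    # marginal 1 (assumed)
    stats.gamma(a=2.0).ppf(u_dep[:, 1]),      # marginal 2 (assumed)
])
```

Vine copulas replace the single Cholesky step by a sequence of conditional pair-copula inversions, but the overall pattern (quasi-random design in, dependent physical sample out) is the same.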
14:40
Numbers or Structures: On the Futures of Structural Reliability?
Dr. Karl Breitung | TU München | Germany
Author:
Dr. Karl Breitung | TU München | Germany
In the beginnings of structural reliability, the problems were mainly that a mathematical/mechanical model was given for which certain properties had to be found, in general extreme value distributions.
Over the years this remained the undisputed main topic: Given a model, produce an estimate for failure probabilities.
There were different approaches. In Monte Carlo-based procedures, one tried to improve crude simulation with refined concepts such as importance sampling or subset simulation.
In FORM/SORM concepts one starts by introducing a simplified geometric structure approximating the limit state domain, and the initial estimate obtained from this region is refined by
importance sampling or response surfaces. Often both concepts were combined into hybrid methods.
Now, one sees that the problems in structural reliability are changing.
Problems now involve high-dimensional spaces and limit state functions which are outputs of finite element packages. So there is often no longer a well-defined mathematical model; the function values come from something like a black box.
So, with a simple underlying structure lost, is it still the main task of structural reliability to produce numbers? Input quantities for the FEM programs are often uncertain, so the obtained failure probabilities are wrong anyway.
Plato said that decisions should be based on knowledge, not on numbers. How to find knowledge instead of numbers?
This can be achieved by studying the geometry of the regions of the limit state surface where failure events are most likely, i.e. near the beta points, together with the distributions of the output quantities derived from those of the input quantities. This should result in finding structures which give information about the causes of failure.
So it might be worth considering a shift of focus in structural reliability from number crunching towards the detection and study of the geometric and probabilistic structures responsible for failure.
15:00
Quantification of Uncertainty Resulting from Microstructure Morphology Variation Based on Statistically Similar Representative Volume Elements
Niklas Miska | TU Dresden
Authors:
Niklas Miska | TU Dresden
Dr. Stefan Prüger | TU Dresden
Prof. Daniel Balzani | TU Dresden; Dresden Center for Computational Materials Science (DCMS)
Improved design requirements have led to a demand for materials that combine advantageous properties such as high strength and high ductility. This combination can be achieved by making use of pronounced microstructures. As the microstructure morphology is subject to unavoidable variation, the macroscopic behavior is uncertain. Hence, when e.g. computing the fail-safety of structures made of these modern materials, it is reasonable to quantify the microstructure-based uncertainty. Instead of carrying out a large number of tests with specimens of the material to quantify the uncertainty, we propose a numerical approach exploiting the concept of Statistically Similar Representative Volume Elements (SSRVEs), cf. [1]. SSRVEs are artificial microstructure morphologies which are fitted to the real microstructure in the sense of chosen higher-order statistical measures. By defining bounds on these statistical measures and considering all SSRVEs within these bounds, it is possible to characterize the variation of the microstructure morphology. The resulting set of SSRVEs is used to perform a Monte Carlo calculation in terms of finite elements to obtain homogenized properties of the material on the macroscale. From the resulting homogenized properties, statistics of the macroscopic material parameters are quantified. Their statistical moments depend on the amount of available and considered microstructure data and on their statistical distribution. The Monte Carlo computation is automated with an extended Finite Cell Method (FCM), cf. [2], which allows the use of non-conforming meshes for each of the considered SSRVEs. The proposed method is demonstrated for an advanced high strength steel.
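The outer Monte Carlo loop can be sketched as follows; here the Finite Cell homogenization of one SSRVE is replaced by a mock function, so all numbers are purely illustrative:

```python
import numpy as np

def evaluate_ssrve(seed):
    """Placeholder for one FCM homogenization run of an SSRVE sample;
    returns a mock effective Young's modulus in GPa."""
    r = np.random.default_rng(seed)
    return 200.0 + 15.0 * r.standard_normal()

# Monte Carlo over the ensemble of admissible SSRVEs, then statistics
# of the homogenized macroscopic parameter.
samples = np.array([evaluate_ssrve(s) for s in range(500)])
mean = samples.mean()
std = samples.std(ddof=1)
```

In the actual workflow each `evaluate_ssrve` call would be a finite-cell simulation on a non-conforming mesh, and higher statistical moments could be extracted from `samples` in the same way.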
15:20
Multi-scale failure analysis with polymorphic uncertainties for optimal design of rotor blades
Robert Gruhlke | WIAS Berlin | Germany
Author:
Robert Gruhlke | WIAS Berlin | Germany
Wind turbine blades are thin-walled spatial structures typically consisting of two composite
shells and one or two shear webs assembled with adhesive bonds. Full-size mechanical tests of rotor blades are mandatory for certification but very costly.
The definition of representative sub-components typically involves expert knowledge on the one hand and is impeded by limited information on specific physical parameters on the other, leading to polymorphic uncertainties.
As an important example of such a sub-component, the Henkel beam has been developed
for testing adhesive bonds, which play a key role in the structural integrity and reliability of rotor blades. Small defects, i.e. voids and delaminations, are common in the bond lines due to the manufacturing and application process and can cause multiple tensile cracks, thus leading to macroscopic separation between spar cap and shear web.
Applying a transformation of the stochastic microstructure to a reference configuration, we build an uncertain microscopic model in a UQ setting, which leads to a high-dimensional problem.
By numerical upscaling we construct a statistical surrogate model using modern reduction methods, adaptivity, and low-rank compression via hierarchical tensor representations to overcome the curse of dimensionality. The structure of voids, encoded in oscillating parameter coefficients, is then resolved via a multiscale FEM.