[ Moved from MW HS 2235 ]
Data science and numerical simulation are moving rapidly toward workflow-based approaches for complex multiscale and multiphysics problems, which better suit the many-task paradigm followed by HPC centers on the path to exascale. As a result, a wide range of tools and frameworks (both generic and domain-specific) have been developed over the years to support scientists in designing, implementing and running their complex simulations and workflows efficiently on HPC systems.
In order to produce "actionable" results, these simulations and workflows need to be verified, validated and equipped with uncertainty quantification (VVUQ), so that their output can be relied upon when making important decisions in various domains. The VECMA project (https://www.vecma.eu) aims to develop an open-source toolkit (https://www.vecma-toolkit.eu) to ease, and where possible automate, the addition of VVUQ to such multiscale and multiphysics simulations.
In this minisymposium we invite developers of this toolkit to present its most recent version, and researchers from various domains (fusion, materials, climate, biomedicine, etc.) to present how it can be integrated into existing applications in order to add VVUQ capabilities.
16:30
Introduction to the VECMA toolkit
Derek Groen | Brunel University London | United Kingdom
Authors:
Derek Groen | Brunel University London | United Kingdom
Hamid Arabnejad | Brunel University London | United Kingdom
Diana Suleimenova | Brunel University London | United Kingdom
Multiscale simulations are an essential computational method in a range of research disciplines, and provide unprecedented levels of scientific insight at a tractable cost in terms of effort and compute resources. To provide this, we need such simulations to produce results that are both robust and actionable. The VECMA toolkit (VECMAtk), which was first released in June 2019, establishes a platform to achieve this by exposing patterns for verification, validation and uncertainty quantification (VVUQ). These patterns can be combined to capture complex scenarios, applied to applications in disparate domains, and used to run multiscale simulations on any desktop, cluster or supercomputing platform.
17:00
Verification, Validation & Uncertainty Quantification for Molecular Dynamics Simulation
Shunzhou Wan | University College London | United Kingdom
Authors:
Peter Coveney | University College London | United Kingdom
Robin Richardson | University College London | United Kingdom
David W Wright | University College London | United Kingdom
Shunzhou Wan | University College London | United Kingdom
Molecular dynamics is a modelling and simulation method encountered in a very wide range of scientific disciplines, from physics, chemistry and materials to engineering, biology and medicine. Uncertainty arises from a combination of systematic and random sources. To get a firm handle on the former, one needs to understand and control the latter, which stem from the inherently chaotic nature of the simulated trajectories. To deal with this, one usually combines temporal and ensemble averaging to generate statistical averages that are robust for the purposes of verification and validation.
The present talk describes the use of the VECMA toolkit to facilitate verification, validation and uncertainty quantification for some examples of these kinds of simulations. Applications include predictions of advanced materials properties, ligand-protein binding affinities, mutation rates in DNA and the behaviour of DNA nanopores. Recent work indicates that, owing to a newly uncovered pathology in IEEE floating-point numbers, the statistical averages computed in these calculations themselves contain systematic errors whose existence was hitherto unknown. We conclude by discussing ways in which such additional errors can be quantified and reported.
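The ensemble-averaging strategy described above can be sketched in a few lines. This is an illustrative stand-in, not the authors' MD code: `run_replica` is a hypothetical placeholder that mimics one chaotic trajectory with a random walk, and the ensemble statistics are computed over independent replicas exactly as one would over independent MD runs.

```python
import statistics
import random

def run_replica(seed, n_steps=1000):
    """Hypothetical stand-in for one MD replica: a random walk whose
    time average plays the role of a simulated observable."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n_steps):
        x += rng.gauss(0.0, 1.0)   # stand-in for chaotic dynamics
        total += x
    return total / n_steps          # temporal average for this replica

# Ensemble averaging: independent replicas tame trajectory-level noise,
# yielding a statistical average with a quantified random error.
replicas = [run_replica(seed) for seed in range(25)]
mean = statistics.mean(replicas)
sem = statistics.stdev(replicas) / len(replicas) ** 0.5  # standard error
print(f"ensemble mean = {mean:.3f} +/- {sem:.3f}")
```

The standard error of the ensemble mean quantifies the random component of the uncertainty; systematic sources (force-field choice, floating-point pathologies) must be assessed separately.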
17:30
VVUQ tools applied to fusion multiscale workflow simulations
Jalal Lakhlili | Max Planck Institute for Plasma Physics | Germany
Authors:
Jalal Lakhlili | Max Planck Institute for Plasma Physics | Germany
David Coster | Max Planck Institute for Plasma Physics | Germany
Olivier Hoenen | Max Planck Institute for Plasma Physics | Germany
Onnie O Luk | Max Planck Institute for Plasma Physics | Germany
Bruce D Scott | Max Planck Institute for Plasma Physics | Germany
Roland Preuss | Max Planck Institute for Plasma Physics | Germany
Udo von Toussaint | Max Planck Institute for Plasma Physics | Germany
The prediction of fusion plasma dynamics is, in general, very complex. One aspect of this complexity arises from the multiscale nature of the plasma, in which microscale turbulence affects global transport at the scale of the fusion device. To study such multiscale phenomena, a workflow is designed that brings together several single-scale codes for equilibrium, transport and turbulence modelling, allowing the study of the impact of propagated turbulent noise on the overall plasma transport.
The goal is to produce temperature and density profiles, along with their confidence intervals, arising from the propagation of uncertainties from one single-scale component to the rest of the workflow. This allows for improved validation of the simulations, since the experimental results come with uncertainties as well. We created a workflow that incorporates uncertainty quantification and sensitivity analysis through the EasyVVUQ tool, using a Pilot-Job mechanism to exploit the high level of parallelism inherent in the UQ workflow.
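The sample/execute/collate/analyse loop that such a UQ workflow automates can be illustrated schematically. This is not the EasyVVUQ API itself: `simulation` is a hypothetical single-parameter stand-in for one single-scale code, and the loop below is the pattern that EasyVVUQ manages and a Pilot-Job system parallelises.

```python
import random
import statistics

def simulation(diffusivity):
    """Hypothetical stand-in for one single-scale code: maps an
    uncertain input parameter to a scalar output profile value."""
    return 100.0 / (1.0 + diffusivity)

# 1. Sample the uncertain input (here: a uniform prior on diffusivity).
rng = random.Random(42)
samples = [rng.uniform(0.5, 1.5) for _ in range(200)]

# 2. Execute the runs -- these are independent, which is why a
#    Pilot-Job mechanism can dispatch them in parallel.
outputs = [simulation(d) for d in samples]

# 3. Collate and analyse: propagate the input uncertainty to the output,
#    yielding the profile value with a confidence interval.
mean = statistics.mean(outputs)
ci = 1.96 * statistics.stdev(outputs) / len(outputs) ** 0.5
print(f"output mean = {mean:.2f}, 95% CI half-width = {ci:.2f}")
```

In the real workflow, step 2 launches full turbulence/transport runs and step 3 additionally produces sensitivity indices, but the control flow is the same.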
18:00
Deriving reduced subgrid scale models from data
Wouter Edeling | Centrum Wiskunde & Informatica | Netherlands
Authors:
Wouter Edeling | Centrum Wiskunde & Informatica | Netherlands
Daan Crommelin | Centrum Wiskunde & Informatica | Netherlands
Recent years have seen a growing interest in using data-driven (machine-learning) techniques for the construction of cheap surrogate models of turbulent subgrid-scale stresses. These stresses display complex spatio-temporal structures and constitute a difficult surrogate target. We propose a data-preprocessing step in which we derive alternative subgrid-scale models that are virtually exact for a user-specified set of spatially integrated quantities of interest. The unclosed component of these new subgrid-scale models is of the same size as this set of integrated quantities of interest. As a result, the corresponding training data is massively reduced in size, decreasing the complexity of the subsequent surrogate construction. The data-preprocessing step can be generalized and made part of the VECMA toolkit, alongside several surrogate methodologies. We demonstrate our approach on the two-dimensional forced-dissipative vorticity equations.
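The data-reduction idea above can be sketched as follows. This is an illustrative assumption, not the authors' actual preprocessing: `integrated_qois` is a hypothetical helper that collapses a 2-D field to a handful of spatial integrals against low-order cosine test functions, so the surrogate's training target shrinks from the full grid to the size of the QoI set.

```python
import math
import random

def integrated_qois(field, n_qois=4):
    """Collapse a 2-D subgrid field to a few spatially integrated
    quantities of interest: inner products with cosine test functions."""
    n = len(field)
    qois = []
    for k in range(n_qois):
        total = 0.0
        for i, row in enumerate(field):
            w = math.cos(2 * math.pi * k * i / n)  # low-order test function
            total += w * sum(row)
        qois.append(total / (n * n))               # normalised spatial integral
    return qois

# A 64x64 snapshot (4096 grid values) collapses to n_qois = 4 targets,
# massively shrinking the training data for the surrogate.
rng = random.Random(0)
n = 64
snapshot = [[rng.gauss(0.0, 1.0) for _ in range(n)] for _ in range(n)]
qois = integrated_qois(snapshot)
print(len(snapshot) * len(snapshot[0]), "->", len(qois))
```

The real method chooses the test functions from the user-specified quantities of interest; the point of the sketch is only the dimensionality collapse.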