The task of processing large amounts of data in order to model complex associated dynamical systems is an important challenge of the 21st century. The need for novel mathematical concepts and advanced computational techniques in this context has accelerated research in the associated fields of Data Assimilation and Machine Learning. In recent years the two research communities have grown closer, resulting in advanced numerical methods that combine the strengths of both worlds, and in theoretical underpinnings for existing and new techniques. The aim of this minisymposium is to foster these emerging bridges and to identify limitations and possible future avenues by bringing together people from both communities and creating room for scientific exchange.
14:00
Hierarchical Data Assimilation via Multilevel Monte Carlo
Raul F. Tempone | RWTH Aachen | Germany
Author:
Raul F. Tempone | RWTH Aachen | Germany
This work embeds a multilevel Monte Carlo (MLMC) sampling strategy into the Monte Carlo step of the ensemble Kalman filter (EnKF), thereby yielding a multilevel ensemble Kalman filter (MLEnKF) with provably superior asymptotic cost at a given accuracy level. The development of MLEnKF for finite-dimensional state spaces in our previous work is here extended to models with infinite-dimensional state spaces in the form of spatial fields.
This is a collaboration with A. Chernov, H. Hoel, K. Law, F. Nobile.
References:
— A. Chernov, H. Hoel, K. Law, F. Nobile and R. Tempone, "Multilevel ensemble Kalman filtering for spatially extended models", arXiv:1608.08558v1.
— H. Hoel, K. J. H. Law and R. Tempone, "Multilevel ensemble Kalman filtering", SIAM Journal on Numerical Analysis (SINUM), 54 (2016), no. 3, 1813--1839. arXiv:1502.06069v1.
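As a rough illustration of the MLMC mechanism underlying such a filter, the sketch below estimates a mean via the telescoping MLMC sum with coupled coarse/fine Euler-Maruyama paths of a scalar geometric Brownian motion. This is a toy scalar example only; the model, parameters, and level schedule are illustrative choices, not the spatial-field setting of the talk.

```python
import numpy as np

def euler_gbm(h, n_steps, dW, s0=1.0, mu=0.05, sigma=0.2):
    """Euler-Maruyama endpoint of geometric Brownian motion dS = mu*S dt + sigma*S dW."""
    s = s0
    for k in range(n_steps):
        s = s + mu * s * h + sigma * s * dW[k]
    return s

def mlmc_mean(L, n_samples, T=1.0, seed=None):
    """Telescoping MLMC estimate of E[S_T]:
    E[S^0] + sum_{l=1}^{L} E[S^l - S^{l-1}], with coupled coarse/fine paths."""
    rng = np.random.default_rng(seed)
    est = 0.0
    for l in range(L + 1):
        nf = 2 ** (l + 2)                      # fine time steps at level l
        hf = T / nf
        diffs = np.empty(n_samples[l])
        for i in range(n_samples[l]):
            dWf = rng.normal(0.0, np.sqrt(hf), nf)   # fine Brownian increments
            fine = euler_gbm(hf, nf, dWf)
            if l == 0:
                diffs[i] = fine
            else:
                # Coarse path driven by the SAME Brownian motion:
                # pairwise sums of the fine increments.
                dWc = dWf.reshape(-1, 2).sum(axis=1)
                diffs[i] = fine - euler_gbm(2 * hf, nf // 2, dWc)
        est += diffs.mean()
    return est

# Fewer samples are needed on finer (more expensive) levels.
est = mlmc_mean(3, [4000, 2000, 1000, 500], seed=0)  # approximates E[S_T] = exp(mu*T)
```

Because the coupled level differences have small variance, most samples can be taken on the cheap coarse levels, which is the source of the asymptotic cost advantage the abstract refers to.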
14:30
Data-driven reconstruction of chaotic dynamics using data assimilation and machine learning
Marc Bocquet | Université Paris-Est | France
Authors:
Marc Bocquet | Université Paris-Est | France
Alberto Carrassi | University of Reading | United Kingdom
Julien Brajard | Sorbonne University | France
Laurent Bertino | Nansen Environmental and Remote Sensing Center | Norway
Recent progress in machine learning has shown how to forecast and, to some extent, learn the dynamics of a model from observations, resorting in particular to neural networks and deep learning techniques. These approaches are often limited by the need for a dense observational network, or at least regularly subsampled observations. Our proposal is to rely on data assimilation techniques, originally developed to best combine dynamical models with sparse and noisy data. We demonstrate that by combining machine learning with data assimilation techniques, it is possible to produce realistic and skilful surrogate models of the underlying dynamics given sparse and noisy observations. That same goal can be achieved more efficiently using data assimilation techniques exclusively, although with much less flexibility. Assuming either locality or the use of convolutional neural networks, we show how the method can be extended to higher-dimensional dynamics. Finally, seeing the reconstruction of chaotic dynamics as an optimisation problem, we discuss the efficiency of these algorithms.
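The alternation described above, analysis steps that blend surrogate forecasts with noisy observations followed by refitting the surrogate on the analysis states, can be sketched on a toy chaotic map. Everything here (the logistic map, the fixed scalar gain `K`, the quadratic feature basis, the iteration count) is an illustrative assumption, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def truth(x):                      # hidden chaotic dynamics (logistic map)
    return 3.9 * x * (1.0 - x)

# Noisy observations of a chaotic trajectory
N, obs_std = 400, 0.02
x = np.empty(N); x[0] = 0.3
for n in range(N - 1):
    x[n + 1] = truth(x[n])
y = x + rng.normal(0.0, obs_std, N)

def fit_surrogate(states):
    """Least-squares fit of a quadratic surrogate map x -> a + b*x + c*x**2."""
    A = np.column_stack([np.ones(N - 1), states[:-1], states[:-1] ** 2])
    coef, *_ = np.linalg.lstsq(A, states[1:], rcond=None)
    return coef

# Alternate (1) an analysis step blending the surrogate forecast with the
# observations (a scalar Kalman-style update with a fixed toy gain) and
# (2) refitting the surrogate on the analysis states.
states = y.copy()
for _ in range(5):
    a, b, c = fit_surrogate(states)
    forecast = np.empty(N)
    forecast[0] = y[0]
    for n in range(N - 1):
        forecast[n + 1] = a + b * states[n] + c * states[n] ** 2
    K = 0.5                        # fixed gain; a real filter would adapt this
    states = forecast + K * (y - forecast)

coef = fit_surrogate(states)       # recovered map, close to 3.9*x*(1-x)
```

Even this crude fixed-gain analysis suppresses enough observation noise for the regression to recover a skilful surrogate of the hidden map from noisy data alone.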
15:00
Analysis and application of the ensemble Kalman inversion
Simon Weissmann | University of Mannheim | Germany
Author:
Simon Weissmann | University of Mannheim | Germany
The ensemble Kalman filter (EnKF) is a widely used methodology for data assimilation problems and has recently been generalized to inverse problems, known as ensemble Kalman inversion (EKI). Our theoretical results are based on the analysis of the continuous-time limit of the algorithm, i.e. a system of coupled stochastic differential equations. We show well-posedness of the scheme and present accuracy results for the EKI estimate. We view the method as a derivative-free optimization method for the least-squares misfit functional, which opens up the perspective of using the method in various areas of application such as imaging, groundwater flow problems, and biological problems, as well as in the context of the training of neural networks. We will further discuss how machine learning techniques can be used to accelerate the identification of the unknown parameters via the EKI.
15:30
Comparing frameworks for blending machine learning, physical models, and data assimilation techniques
Matthew E. Levine | California Institute of Technology | United States
Author:
Matthew E. Levine | California Institute of Technology | United States
Machine learning (ML) has shown great promise in solving real-world prediction problems; however, black-box methods are unlikely to replace physics-based modeling techniques. Nevertheless, ML methods can be blended with physical models to improve overall fidelity. In particular, we find that ML can help compensate for unknown forms of model error when learning a dynamical system from data. In this setting, we compare approaches that (a) use Gaussian process regression to learn residuals between physics-based predictions and measurements, (b) embed solution operators of physical models within a recurrent neural network architecture, and (c) impose physics-based constraints on a deep neural network. In addition, we examine how 3DVAR update matrices can be learned online using a stochastic gradient descent paradigm, and, more broadly, how we can extend our studies of (a-c) to account for unknown model error in a 3DVAR setting.
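Approach (a), learning the residual between physics-based predictions and measurements with Gaussian process regression, can be sketched in a few lines. The one-step models, the kernel lengthscale, and the noise levels below are toy assumptions for illustration, not the speaker's setup.

```python
import numpy as np

rng = np.random.default_rng(2)

def physics(x):            # imperfect physics-based one-step model
    return x + 0.1 * np.sin(x)

def truth(x):              # true dynamics = physics + unknown model error
    return physics(x) + 0.05 * np.cos(2.0 * x)

# Training data: measured one-step transitions with small observation noise
X = rng.uniform(-3.0, 3.0, 40)
R = truth(X) + rng.normal(0.0, 1e-3, X.size) - physics(X)   # observed residuals

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel matrix between point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# GP posterior mean of the residual (zero-mean prior, RBF kernel)
noise_var = 1e-6
alpha = np.linalg.solve(rbf(X, X) + noise_var * np.eye(X.size), R)

def residual_gp(xs):
    return rbf(xs, X) @ alpha

xs = np.linspace(-2.5, 2.5, 50)
hybrid = physics(xs) + residual_gp(xs)    # physics + learned GP correction
```

The hybrid predictor keeps the physics where the model is right and lets the data-driven GP term absorb the structured model error, which is the blending idea the abstract compares against the recurrent and constrained-network alternatives.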