Information-geometry of physics-informed statistical manifolds and its use in data assimilation

03/01/2021
by Francesca Boso et al.

The data-aware method of distributions (DA-MD) is a low-dimensional data assimilation procedure for forecasting the behavior of dynamical systems described by differential equations. It combines a sequential Bayesian update with the method of distributions (MD): the former assimilates available observations, while the latter propagates the (joint) probability distribution of the uncertain system state(s). The core of DA-MD is the minimization of a distance between an observation and a prediction in distributional terms, with prior and posterior distributions constrained to a statistical manifold defined by the MD. We leverage the information-geometric properties of this statistical manifold to reduce predictive uncertainty via data assimilation. Specifically, we exploit the information-geometric structures induced by two discrepancy metrics, the Kullback-Leibler divergence and the Wasserstein distance, each of which yields an explicit natural gradient descent. To further accelerate the optimization, we build a deep neural network as a surrogate model for the MD that enables automatic differentiation. The manifold's geometry is quantified without sampling, yielding an accurate approximation of the gradient descent direction. Our numerical experiments demonstrate that accounting for the information geometry of the manifold significantly reduces the computational cost of data assimilation, both by facilitating the calculation of gradients and by reducing the number of required iterations. Storage needs and computational cost both depend on the dimensionality of the statistical manifold, which is typically small by MD construction. When convergence is achieved, the Kullback-Leibler and L_2 Wasserstein metrics perform similarly, with the former being more sensitive to poor choices of the prior.
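To make the two descent schemes concrete, here is a minimal toy sketch on a two-parameter Gaussian manifold parameterized by (mu, sigma), standing in for the MD-constrained manifold of the paper. It contrasts natural gradient descent under the Kullback-Leibler divergence (preconditioning the Euclidean gradient by the inverse Fisher information of the Gaussian family) with gradient descent under the L_2 Wasserstein distance, whose geometry is flat in (mu, sigma) coordinates for one-dimensional location-scale families. All function names, the target "observation" distribution, and the step sizes are hypothetical illustrations, not the paper's actual setup.

```python
import math

def kl_gaussians(mu1, s1, mu2, s2):
    # Closed-form KL( N(mu1, s1^2) || N(mu2, s2^2) )
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def natural_grad_kl_step(mu, s, mu_obs, s_obs, lr):
    # Euclidean gradient of the KL divergence w.r.t. (mu, s)
    g_mu = (mu - mu_obs) / s_obs**2
    g_s = -1.0 / s + s / s_obs**2
    # Fisher information of N(mu, s^2) is diag(1/s^2, 2/s^2);
    # the natural gradient is F^{-1} applied to the Euclidean gradient.
    return mu - lr * s**2 * g_mu, s - lr * (s**2 / 2.0) * g_s

def w2_grad_step(mu, s, mu_obs, s_obs, lr):
    # For 1-D Gaussians, W2^2 = (mu - mu_obs)^2 + (s - s_obs)^2,
    # and the Wasserstein metric is Euclidean in (mu, s), so plain
    # gradient descent already follows the natural direction.
    return mu - lr * 2.0 * (mu - mu_obs), s - lr * 2.0 * (s - s_obs)

# Toy "observation" distribution and a common prior on the manifold
mu_obs, s_obs = 1.0, 0.5
mu_kl, s_kl = 0.0, 1.0
mu_w2, s_w2 = 0.0, 1.0
for _ in range(200):
    mu_kl, s_kl = natural_grad_kl_step(mu_kl, s_kl, mu_obs, s_obs, 0.1)
    mu_w2, s_w2 = w2_grad_step(mu_w2, s_w2, mu_obs, s_obs, 0.1)

# Both descents drive the prior toward the observation distribution;
# kl_gaussians(mu_kl, s_kl, mu_obs, s_obs) is now near zero.
```

In the paper's setting, the Euclidean gradient would come from automatic differentiation through the neural-network surrogate of the MD rather than from a closed form, but the metric-dependent preconditioning step is the same idea.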

