Operator-splitting schemes for degenerate conservative-dissipative systems

05/24/2021
by   Daniel Adams, et al.

The theory of Wasserstein gradient flows in the space of probability measures provides a powerful framework for studying dissipative partial differential equations (PDEs). It can be used to prove well-posedness, regularity, stability, and quantitative convergence to equilibrium. However, many PDEs are not gradient flows, and hence the theory is not immediately applicable. In this work we develop a straightforward entropy-regularised splitting scheme for degenerate non-local non-gradient systems. The approach consists of two main stages: first, we split the dynamics into conservative and dissipative forces; second, we perturb the problem so that the diffusion is no longer singular and perform a weighted-Wasserstein “JKO-type” descent step. Entropic regularisation of the optimal transport problem opens the way to efficient numerical methods for solving these gradient flows. We illustrate the generality of our approach with a number of examples, including the regularised Vlasov-Poisson-Fokker-Planck equation, to which our results are applicable.
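To make the two-stage structure concrete, here is a minimal NumPy sketch of one splitting step on a periodic 1D grid. The conservative stage is an upwind finite-volume advection step; the dissipative stage is an implicit-Euler diffusion solve, used purely as a simple stand-in for the entropic “JKO-type” descent step described in the abstract (the paper's actual scheme solves a regularised variational problem per step). The grid size, drift field `v`, and parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def split_step(rho, v, dx, dt, sigma):
    """One operator-splitting step on a periodic 1D grid (illustrative sketch).

    Stage 1 (conservative): upwind finite-volume step for
        d_t rho + d_x(v * rho) = 0.
    Stage 2 (dissipative): implicit Euler for d_t rho = sigma * d_xx rho,
    a simple stand-in for the entropic JKO-type descent step.
    """
    # Upwind flux at the right face of each cell (periodic wraparound via roll)
    flux = np.where(v > 0, v * rho, v * np.roll(rho, -1))
    rho = rho - dt / dx * (flux - np.roll(flux, 1))

    # Periodic (circulant) discrete Laplacian; implicit solve conserves mass
    n = rho.size
    eye = np.eye(n)
    lap = np.roll(eye, 1, axis=1) - 2.0 * eye + np.roll(eye, -1, axis=1)
    A = eye - dt * sigma / dx**2 * lap
    return np.linalg.solve(A, rho)

# Usage: advect and diffuse a Gaussian bump; total mass stays fixed.
n, L = 128, 1.0
dx = L / n
x = (np.arange(n) + 0.5) * dx
rho = np.exp(-200.0 * (x - 0.5) ** 2)
rho /= rho.sum() * dx                 # normalise to a probability density
v = 0.3 * np.ones(n)                  # constant drift field (demo assumption)
for _ in range(50):
    rho = split_step(rho, v, dx, dt=1e-3, sigma=5e-3)
print(abs(rho.sum() * dx - 1.0) < 1e-10)
```

Both stages conserve mass by construction: the advection update telescopes over the periodic fluxes, and the columns of the discrete Laplacian sum to zero, so the implicit solve preserves the total integral.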


research
06/01/2021

Large-Scale Wasserstein Gradient Flows

Wasserstein gradient flows provide a powerful means of understanding and...
research
11/16/2020

The back-and-forth method for Wasserstein gradient flows

We present a method to efficiently compute Wasserstein gradient flows. O...
research
02/23/2018

Langevin Monte Carlo and JKO splitting

Algorithms based on discretizing Langevin diffusion are popular tools fo...
research
05/10/2021

Scalar auxiliary variable approach for conservative/dissipative partial differential equations with unbounded energy

In this paper, we present a novel investigation of the so-called SAV app...
research
09/30/2018

Accelerated PDE's for efficient solution of regularized inversion problems

We further develop a new framework, called PDE Acceleration, by applying...
research
05/24/2022

Data driven gradient flows

We present a framework enabling variational data assimilation for gradie...
research
06/01/2021

Optimizing Functionals on the Space of Probabilities with Input Convex Neural Networks

Gradient flows are a powerful tool for optimizing functionals in general...
