
The back-and-forth method for Wasserstein gradient flows

11/16/2020
by   Matt Jacobs, et al.

We present a method to efficiently compute Wasserstein gradient flows. Our approach is based on a generalization of the back-and-forth method (BFM) introduced by Jacobs and Léger to solve optimal transport problems. We evolve the gradient flow by solving the dual problem to the JKO scheme. In general, the dual problem is much better behaved than the primal problem. This allows us to efficiently run large-scale simulations for a large class of internal energies including singular and non-convex energies.
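The JKO scheme mentioned above is the standard minimizing-movement time discretization of a Wasserstein gradient flow; the formulation below is the textbook version, written schematically rather than in this paper's notation. Given a time step \(\tau > 0\), internal energy \(E\), and current density \(\rho_k\), one step solves

```latex
\rho_{k+1} \in \operatorname*{argmin}_{\rho} \; E(\rho) + \frac{1}{2\tau} W_2^2(\rho, \rho_k).
```

By Kantorovich duality applied to the \(W_2^2\) term, this primal problem admits a concave dual over a single potential \(\varphi\), schematically

```latex
\sup_{\varphi} \; \int \varphi^{c}\, d\rho_k - E^*(-\varphi),
\qquad
\varphi^{c}(x) = \inf_{y} \Big[ \frac{|x-y|^2}{2\tau} - \varphi(y) \Big],
```

where \(E^*\) denotes the convex conjugate of \(E\). Maximizing over \(\varphi\) avoids optimizing directly over densities, which is why the dual problem is better behaved for singular or non-convex internal energies.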


Related research

12/04/2021  Variational Wasserstein gradient flow
The gradient flow of a function over the space of probability densities ...

01/11/2023  Wasserstein Gradient Flows of the Discrepancy with Distance Kernel on the Line
This paper provides results on Wasserstein gradient flows between measur...

05/24/2021  Operator-splitting schemes for degenerate conservative-dissipative systems
The theory of Wasserstein gradient flows in the space of probability mea...

11/30/2022  Taming Hyperparameter Tuning in Continuous Normalizing Flows Using the JKO Scheme
A normalizing flow (NF) is a mapping that transforms a chosen probabilit...

12/09/2020  Dual perspective method for solving the point in a polygon problem
A novel method has been introduced to solve a point inclusion in a polyg...

06/10/2020  Learning normalizing flows from Entropy-Kantorovich potentials
We approach the problem of learning continuous normalizing flows from a ...

02/18/2020  A Wasserstein Minimum Velocity Approach to Learning Unnormalized Models
Score matching provides an effective approach to learning flexible unnor...