Natural gradient via optimal transport I

03/16/2018
by Wuchen Li, et al.

We study a natural Wasserstein gradient flow on manifolds of probability distributions with discrete sample spaces. We derive the Riemannian structure of the probability simplex from the dynamical formulation of the Wasserstein distance on a weighted graph. We then pull this geometric structure back to the parameter space of any given probability model, which allows us to define a natural gradient flow there. In contrast to the Fisher-Rao natural gradient, the Wasserstein natural gradient incorporates a ground metric on the sample space. We discuss implementations based on the forward and backward Euler methods, and illustrate the analysis on elementary exponential family examples.
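To make the forward Euler discretization concrete, here is a minimal sketch of one explicit step of a Wasserstein gradient flow on the probability simplex over a weighted graph. The energy (KL divergence to a target `q`), the edge density weight `theta_ij = (p_i + p_j)/2`, and all function names are illustrative assumptions, not necessarily the paper's exact choices.

```python
import numpy as np

def wasserstein_euler_step(p, q, W, h):
    """One forward-Euler step of the Wasserstein gradient flow of
    F(p) = KL(p || q) on a graph with symmetric edge weights W.

    The edge density theta_ij = (p_i + p_j)/2 is one common choice
    for the weighted-graph Wasserstein metric (an assumption here).
    """
    grad = np.log(p / q) + 1.0                       # Euclidean gradient of KL(p || q)
    theta = 0.5 * (p[:, None] + p[None, :])          # edge density weights theta_ij
    # Antisymmetric flux: mass flows from nodes with larger gradient
    # to neighbors with smaller gradient, so total mass is conserved.
    flux = W * theta * (grad[None, :] - grad[:, None])
    return p + h * flux.sum(axis=1)
```

Because the flux matrix is antisymmetric (with symmetric `W` and `theta`), each step conserves total probability mass exactly, and for a sufficiently small step size `h` the energy decreases along the iterates.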


Related research

05/22/2018 · Natural gradient in Wasserstein statistical manifold
We study the Wasserstein natural gradient in parametric statistical mode...

10/21/2019 · Kernelized Wasserstein Natural Gradient
Many machine learning problems can be expressed as the optimization of s...

07/31/2023 · Wasserstein Mirror Gradient Flow as the limit of the Sinkhorn Algorithm
We prove that the sequence of marginals obtained from the iterations of ...

10/15/2020 · Geometry of Sample Spaces
In statistics, independent, identically distributed random samples do no...

04/13/2023 · A Natural Copula
Copulas are widely used in financial economics as well as in other areas...

06/26/2020 · Semi-discrete optimization through semi-discrete optimal transport: a framework for neural architecture search
In this paper we introduce a theoretical framework for semi-discrete opt...

10/10/2019 · Gromov-Wasserstein Averaging in a Riemannian Framework
We introduce a theoretical framework for performing statistical tasks—in...
