Gradient descent algorithms for Bures-Wasserstein barycenters

01/06/2020
by Sinho Chewi, et al.

We study first-order methods to compute the barycenter of a probability distribution over the Bures-Wasserstein manifold. We derive global rates of convergence for both gradient descent and stochastic gradient descent, despite the fact that the barycenter functional is not geodesically convex. Our analysis overcomes this technical hurdle by developing a Polyak-Łojasiewicz (PL) inequality, which is built using tools from optimal transport and metric geometry.
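For Gaussians, gradient descent over the Bures-Wasserstein manifold takes a concrete matrix form: each iterate's covariance is pushed forward by a convex combination of the optimal transport maps to the input covariances. The following is a minimal NumPy sketch of this iteration; the function names, step count, and initialization are illustrative choices, not taken from the paper, and with step size 1 the update reduces to the classical fixed-point iteration for Gaussian barycenters.

```python
import numpy as np

def sqrtm_spd(A):
    """Matrix square root of a symmetric positive semi-definite matrix
    via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def bw_barycenter(covs, weights, steps=50, eta=1.0):
    """Gradient descent on the Bures-Wasserstein manifold for the barycenter
    of centered Gaussians N(0, covs[i]) with the given weights (sketch)."""
    d = covs[0].shape[0]
    I = np.eye(d)
    S = np.eye(d)  # initial covariance iterate
    for _ in range(steps):
        S_half = sqrtm_spd(S)
        S_half_inv = np.linalg.inv(S_half)
        # Weighted average of the optimal transport maps
        # T_i = S^{-1/2} (S^{1/2} Sigma_i S^{1/2})^{1/2} S^{-1/2}
        # from the current iterate N(0, S) to each N(0, Sigma_i).
        T = sum(w * S_half_inv @ sqrtm_spd(S_half @ C @ S_half) @ S_half_inv
                for w, C in zip(weights, covs))
        M = (1.0 - eta) * I + eta * T  # gradient step with step size eta
        S = M @ S @ M                  # push the covariance forward by M
    return S
```

As a sanity check, the equal-weight barycenter of N(0, I) and N(0, 4I) in two dimensions has covariance 2.25·I, since the matrix square roots average to 1.5·I.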


Related research

- 06/03/2020: SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence. "Stein Variational Gradient Descent (SVGD), a popular sampling algorithm,..."
- 06/29/2020: Natural Gradient for Combined Loss Using Wavelets. "Natural gradients have been widely used in optimization of loss function..."
- 03/01/2021: Information-geometry of physics-informed statistical manifolds and its use in data assimilation. "The data-aware method of distributions (DA-MD) is a low-dimension data a..."
- 02/09/2021: Berry–Esseen Bounds for Multivariate Nonlinear Statistics with Applications to M-estimators and Stochastic Gradient Descent Algorithms. "We establish a Berry–Esseen bound for general multivariate nonlinear sta..."
- 07/11/2023: Measure transfer via stochastic slicing and matching. "This paper studies iterative schemes for measure transfer and approximat..."
- 12/04/2019: Exponential convergence of Sobolev gradient descent for a class of nonlinear eigenproblems. "We propose to use the Łojasiewicz inequality as a general tool for analy..."
- 06/15/2021: Non-asymptotic convergence bounds for Wasserstein approximation using point clouds. "Several issues in machine learning and inverse problems require to gener..."
