Stochastic gradient descent on Riemannian manifolds

11/22/2011
by Silvère Bonnabel, et al.

Stochastic gradient descent is a simple approach to finding the local minima of a cost function whose evaluations are corrupted by noise. In this paper, we develop a procedure extending stochastic gradient descent algorithms to the case where the function is defined on a Riemannian manifold. We prove that, as in the Euclidean case, the gradient descent algorithm converges to a critical point of the cost function. The algorithm has numerous potential applications, and is illustrated here by four examples. In particular, a novel gossip algorithm on the set of covariance matrices is derived and tested numerically.
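As a rough illustration of the scheme described in the abstract (not the paper's own code), the sketch below runs Riemannian SGD on the unit sphere: the noisy Euclidean gradient is projected onto the tangent space at the current iterate, the update is mapped back to the manifold by a simple retraction standing in for the exponential map, and the step sizes gamma_t = a / (b + t) satisfy the usual Robbins-Monro conditions. All function names and the leading-eigenvector toy problem are illustrative assumptions, not taken from the paper.

import numpy as np

def project_to_tangent(x, g):
    # Tangent space of the unit sphere at x: remove the radial component of g.
    return g - np.dot(g, x) * x

def retract(x, v):
    # First-order retraction: step in the ambient space, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_sgd(x0, grad_sample, steps=2000, a=1.0, b=10.0):
    # Step sizes gamma_t = a / (b + t): sum gamma_t diverges, sum gamma_t^2 converges.
    x = x0 / np.linalg.norm(x0)
    for t in range(steps):
        g = project_to_tangent(x, grad_sample(x))
        x = retract(x, -(a / (b + t)) * g)
    return x

# Hypothetical usage: the leading eigenvector of C minimizes the expected
# cost -(z^T x)^2 over the sphere when z ~ N(0, C).
rng = np.random.default_rng(0)
C = np.diag([3.0, 1.0, 0.5])

def grad_sample(x):
    z = rng.multivariate_normal(np.zeros(3), C)
    return -2.0 * np.dot(z, x) * z  # noisy Euclidean gradient of -(z^T x)^2

print(riemannian_sgd(np.ones(3), grad_sample))  # approx +/- (1, 0, 0)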

Related research

Asymptotically efficient one-step stochastic gradient descent (06/09/2023)
A generic, fast and asymptotically efficient method for parametric estim...

Averaging Stochastic Gradient Descent on Riemannian Manifolds (02/26/2018)
We consider the minimization of a function defined on a Riemannian manif...

Local optimisation of Nyström samples through stochastic gradient descent (03/24/2022)
We study a relaxed version of the column-sampling problem for the Nyströ...

Central limit theorems for stochastic gradient descent with averaging for stable manifolds (12/19/2019)
In this article we establish new central limit theorems for Ruppert-Poly...

Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector Problems (12/29/2021)
Motivated by the problem of online canonical correlation analysis, we pr...

Laplacian Smoothing Gradient Descent (06/17/2018)
We propose a very simple modification of gradient descent and stochastic...

Convergence of online k-means (02/22/2022)
We prove asymptotic convergence for a general class of k-means algorithm...
