Riemannian Adaptive Optimization Methods

10/01/2018
by Gary Bécigneul, et al.

Several first-order stochastic optimization methods commonly used in the Euclidean domain, such as stochastic gradient descent (SGD), accelerated gradient descent, or variance-reduced methods, have already been adapted to certain Riemannian settings. However, some of the most popular of these optimization tools, namely Adam, Adagrad, and the more recent AMSGrad, remain to be generalized to Riemannian manifolds. We discuss the difficulty of generalizing such adaptive schemes to the most agnostic Riemannian setting, and then provide algorithms and convergence proofs for geodesically convex objectives in the particular case of a product of Riemannian manifolds, in which adaptivity is implemented across the manifolds in the Cartesian product. Our generalization is tight in the sense that choosing the Euclidean space as the Riemannian manifold yields the same algorithms and regret bounds as those already known for the standard algorithms. Experimentally, we show that Riemannian adaptive methods converge faster, and to a lower training loss, than their corresponding baselines on the realistic task of embedding the WordNet taxonomy in the Poincaré ball.
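To make the idea concrete, here is a minimal sketch of an AMSGrad-flavoured update on a single factor manifold, the Poincaré ball of curvature -1: the Euclidean gradient is rescaled into a Riemannian gradient, the adaptive second-moment statistic is kept per manifold (a scalar here, matching the per-factor adaptivity described above), and the step is taken with the exponential map. This is an illustration, not the paper's exact algorithm; in particular it omits the parallel transport of the momentum term and the bias corrections used in the full method, and all function names are our own.

```python
import numpy as np

def mobius_add(x, y):
    """Möbius addition on the Poincaré ball (curvature -1)."""
    x2, y2, xy = x @ x, y @ y, x @ y
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    den = 1 + 2 * xy + x2 * y2
    return num / den

def expmap(x, v):
    """Exponential map at x: follow the geodesic in direction v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-15:
        return x
    lam = 2.0 / (1 - x @ x)  # conformal factor of the ball metric
    t = np.tanh(lam * norm_v / 2) * v / norm_v
    return mobius_add(x, t)

def radam_step(x, egrad, state, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """One adaptive Riemannian step (sketch). Momentum transport and
    bias correction from the full algorithm are deliberately omitted."""
    lam = 2.0 / (1 - x @ x)
    rgrad = egrad / lam**2  # metric is lam^2 * I, so grad = egrad / lam^2
    state["m"] = b1 * state["m"] + (1 - b1) * rgrad
    g2 = rgrad @ rgrad  # squared Riemannian gradient norm, one scalar per manifold
    state["v"] = max(b2 * state["v"] + (1 - b2) * g2, state["v"])  # AMSGrad max
    step = -lr * state["m"] / (np.sqrt(state["v"]) + eps)
    return expmap(x, step)

# Demo: pull a point toward a target inside the ball by minimizing
# the Euclidean squared distance f(x) = ||x - target||^2.
x = np.array([0.5, 0.0])
target = np.array([-0.2, 0.1])
state = {"m": np.zeros(2), "v": 0.0}
for _ in range(100):
    x = radam_step(x, 2 * (x - target), state)  # egrad of f
```

Because every update goes through the exponential map, the iterate never leaves the open unit ball, which is exactly the constraint-free behaviour that motivates working intrinsically on the manifold rather than projecting in ambient coordinates.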

Related research:

- Nonconvex stochastic optimization on manifolds via Riemannian Frank-Wolfe methods (10/09/2019)
- The Dynamics of Riemannian Robbins-Monro Algorithms (06/14/2022)
- Curvature-Dependant Global Convergence Rates for Optimization on Manifolds of Bounded Geometry (08/06/2020)
- Rieoptax: Riemannian Optimization in JAX (10/10/2022)
- Accelerated Optimization on Riemannian Manifolds via Discrete Constrained Variational Integrators (04/15/2021)
- Statistical models and probabilistic methods on Riemannian manifolds (01/26/2021)
- Riemannian Stein Variational Gradient Descent for Bayesian Inference (11/30/2017)
