Trivializations for Gradient-Based Optimization on Manifolds

09/20/2019
by Mario Lezcano Casado, et al.

We introduce a framework to study the transformation of problems with manifold constraints into unconstrained problems through parametrizations in terms of a Euclidean space. We call these parametrizations "trivializations". We prove conditions under which a trivialization is sound in the context of gradient-based optimization and we show how two large families of trivializations have overall favorable properties, but also suffer from a performance issue. We then introduce "dynamic trivializations", which solve this problem, and we show how these form a family of optimization methods that lie between trivializations and Riemannian gradient descent, and combine the benefits of both of them. We then show how to implement these two families of trivializations in practice for different matrix manifolds. To this end, we prove a formula for the gradient of the exponential of matrices, which can be of practical interest on its own. Finally, we show how dynamic trivializations improve the performance of existing methods on standard tasks designed to test long-term memory within neural networks.
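As a concrete illustration of the idea (not the paper's exact construction), one classical trivialization maps an unconstrained Euclidean vector to an orthogonal matrix through the matrix exponential of a skew-symmetric matrix. The sketch below uses SciPy's `expm`; the function name `trivialize_orthogonal` is a hypothetical helper chosen for this example.

```python
import numpy as np
from scipy.linalg import expm

def trivialize_orthogonal(x):
    """Map an unconstrained vector x of length n*n to a matrix in SO(n).

    We reshape x to an n-by-n matrix, skew-symmetrize it, and apply the
    matrix exponential. The result is always orthogonal, so gradient-based
    optimization over x is unconstrained while the constraint holds by
    construction.
    """
    n = int(np.sqrt(x.size))
    M = x.reshape(n, n)
    A = M - M.T           # skew-symmetric: A.T == -A
    return expm(A)        # exponential map at the identity of SO(n)

rng = np.random.default_rng(0)
x = rng.standard_normal(9)            # unconstrained parameters, 3x3 case
Q = trivialize_orthogonal(x)
print(np.allclose(Q.T @ Q, np.eye(3)))  # orthogonality holds by construction
```

Differentiating a loss through such a map requires the gradient of the matrix exponential, which is the formula the paper proves; dynamic trivializations additionally move the base point of the map as optimization proceeds.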

research
04/30/2018

optimParallel: an R Package Providing Parallel Versions of the Gradient-Based Optimization Methods of optim()

The R package optimParallel provides a parallel version of the gradient-...
research
10/09/2020

Adaptive and Momentum Methods on Manifolds Through Trivializations

Adaptive methods do not have a direct generalization to manifolds as the...
research
11/17/2022

Optimization on the symplectic Stiefel manifold: SR decomposition-based retraction and applications

Numerous problems in optics, quantum physics, stability analysis, and co...
research
11/30/2017

Riemannian Stein Variational Gradient Descent for Bayesian Inference

We develop Riemannian Stein Variational Gradient Descent (RSVGD), a Baye...
research
06/15/2023

Optimization on product manifolds under a preconditioned metric

Since optimization on Riemannian manifolds relies on the chosen metric, ...
research
05/21/2022

Symmetry Teleportation for Accelerated Optimization

Existing gradient-based optimization methods update the parameters local...
research
06/24/2020

Randomized Block-Diagonal Preconditioning for Parallel Learning

We study preconditioned gradient-based optimization methods where the pr...
