Generalizing Adam To Manifolds For Efficiently Training Transformers

05/26/2023
by Benedikt Brantner, et al.

One of the primary reasons behind the success of neural networks has been the emergence of an array of new, highly successful optimizers, perhaps most importantly the Adam optimizer. It is widely used for training neural networks, yet notoriously hard to interpret. Lacking a clear physical intuition, Adam is difficult to generalize to manifolds. Some attempts have been made to directly apply parts of the Adam algorithm to manifolds or to find an underlying structure, but a full generalization has remained elusive. In this work a new approach is presented that leverages the special structure of the manifolds relevant for optimizing neural networks, such as the Stiefel manifold, the symplectic Stiefel manifold, the Grassmann manifold and the symplectic Grassmann manifold: all of these are homogeneous spaces and as such admit a global tangent space representation. This global tangent space representation is used to perform all of the steps of the Adam optimizer. The resulting algorithm is then applied to train a transformer for which orthogonality constraints are enforced up to machine precision, and significant speed-ups in the training process are observed. Optimization of neural networks whose weights do not lie on a manifold is identified as a special case of the presented framework. This allows for a flexible implementation in which the learning rate is adapted simultaneously for all parameters, irrespective of whether they are elements of a general manifold or of a vector space.
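
The abstract does not spell out the update rule, but the idea of performing Adam's element-wise moment updates in a fixed ("global") representation and then retracting back to the manifold can be sketched as follows. This is a minimal, illustrative NumPy sketch, not the paper's implementation: the skew-symmetric lift G Y^T - Y G^T, the Cayley retraction, and the names cayley, StiefelAdamSketch and step are assumptions made for illustration; only the standard Adam hyperparameter names (lr, beta1, beta2, eps) are kept.

import numpy as np

def cayley(W):
    # Cayley transform of a skew-symmetric matrix W; the result is orthogonal.
    I = np.eye(W.shape[0])
    return np.linalg.solve(I - 0.5 * W, I + 0.5 * W)

class StiefelAdamSketch:
    # Illustrative Adam-like update for a weight matrix Y with Y.T @ Y = I
    # (Stiefel manifold). The Euclidean gradient is lifted to the fixed space
    # of skew-symmetric n x n matrices, Adam's moment estimates live in that
    # space, and a Cayley retraction maps the update back to the manifold, so
    # orthogonality is preserved up to machine precision.
    def __init__(self, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = None
        self.v = None
        self.t = 0

    def step(self, Y, grad):
        # Lift the Euclidean gradient to a skew-symmetric matrix (one common
        # choice of a point-independent representation; assumed here).
        W = grad @ Y.T - Y @ grad.T
        if self.m is None:
            self.m, self.v = np.zeros_like(W), np.zeros_like(W)
        self.t += 1
        # Standard element-wise Adam moment updates, performed in the fixed space.
        self.m = self.beta1 * self.m + (1 - self.beta1) * W
        self.v = self.beta2 * self.v + (1 - self.beta2) * W**2
        m_hat = self.m / (1 - self.beta1**self.t)
        v_hat = self.v / (1 - self.beta2**self.t)
        D = m_hat / (np.sqrt(v_hat) + self.eps)
        D = 0.5 * (D - D.T)  # project back to skew symmetry before retracting
        # Retraction: multiply by an orthogonal matrix, so Y.T @ Y stays the identity.
        return cayley(-self.lr * D) @ Y

# Example: one update of a random 8 x 3 orthonormal frame.
Y, _ = np.linalg.qr(np.random.randn(8, 3))
opt = StiefelAdamSketch()
Y = opt.step(Y, np.random.randn(8, 3))

Keeping the moments in a fixed matrix space, rather than in the point-dependent tangent space at Y, is what makes Adam's element-wise operations well defined across iterations; in the paper this role is played by the global tangent space representation of the homogeneous space.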


