
Optimization on manifolds: A symplectic approach

by Guilherme França et al.
University of Cambridge

There has been great interest in using tools from dynamical systems and numerical analysis of differential equations to understand and construct new optimization methods. In particular, recently a new paradigm has emerged that applies ideas from mechanics and geometric integration to obtain accelerated optimization methods on Euclidean spaces. This has important consequences given that accelerated methods are the workhorses behind many machine learning applications. In this paper we build upon these advances and propose a framework for dissipative and constrained Hamiltonian systems that is suitable for solving optimization problems on arbitrary smooth manifolds. Importantly, this allows us to leverage the well-established theory of symplectic integration to derive "rate-matching" dissipative integrators. This brings a new perspective to optimization on manifolds whereby convergence guarantees follow by construction from classical arguments in symplectic geometry and backward error analysis. Moreover, we construct two dissipative generalizations of leapfrog that are straightforward to implement: one for Lie groups and homogeneous spaces, that relies on the tractable geodesic flow or a retraction thereof, and the other for constrained submanifolds that is based on a dissipative generalization of the famous RATTLE integrator.
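To make the idea of a "dissipative generalization of leapfrog" concrete, the sketch below implements a conformal symplectic leapfrog on a flat Euclidean space, the simplest setting mentioned in the abstract. It splits the damped Hamiltonian flow (dx/dt = p, dp/dt = -∇f(x) - γp) into an exactly solvable momentum-damping factor and a standard leapfrog step. This is an illustrative sketch, not the paper's manifold construction: the function names, the step size, and the damping coefficient γ are assumptions, and the manifold versions in the paper additionally require a geodesic flow (or retraction) or a RATTLE-style constraint projection in place of the plain drift step.

```python
import numpy as np

def dissipative_leapfrog(grad_f, x0, step=1e-2, gamma=1.0, n_iters=2000):
    """Conformal symplectic (dissipative) leapfrog on R^n.

    Alternates exact half-steps of the damping flow dp/dt = -gamma * p
    with the usual kick-drift-kick leapfrog for H(x, p) = |p|^2/2 + f(x).
    """
    x = np.asarray(x0, dtype=float).copy()
    p = np.zeros_like(x)
    damp = np.exp(-gamma * step / 2.0)  # exact flow of dp/dt = -gamma p over step/2
    for _ in range(n_iters):
        p *= damp                       # half step of dissipation
        p -= 0.5 * step * grad_f(x)     # half kick
        x += step * p                   # drift (replaced by geodesic flow on a manifold)
        p -= 0.5 * step * grad_f(x)     # half kick
        p *= damp                       # half step of dissipation
    return x

# Minimize the quadratic f(x) = 0.5 * ||x||^2, whose gradient is x itself.
x_star = dissipative_leapfrog(lambda x: x, x0=[2.0, -1.0])
```

Because the damping factor is applied symmetrically around the conservative leapfrog step, the scheme preserves the conformal symplectic structure of the continuous dynamics, which is what the "rate-matching" convergence arguments via backward error analysis rely on.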

