Negative curvature obstructs acceleration for geodesically convex optimization, even with exact first-order oracles

11/25/2021
by Christopher Criscitiello, et al.

Hamilton and Moitra (2021) showed that, in certain regimes, it is not possible to accelerate Riemannian gradient descent in the hyperbolic plane if we restrict ourselves to algorithms which make queries in a (large) bounded domain and which receive gradients and function values corrupted by a (small) amount of noise. We show that acceleration remains unachievable for any deterministic algorithm which receives exact gradient and function-value information (unbounded queries, no noise). Our results hold for the classes of strongly and nonstrongly geodesically convex functions, and for a large class of Hadamard manifolds including hyperbolic spaces and the symmetric space SL(n) / SO(n) of positive definite n × n matrices of determinant one. This cements a surprising gap between the complexity of convex optimization and geodesically convex optimization: for hyperbolic spaces, Riemannian gradient descent is optimal on the class of smooth and strongly geodesically convex functions, in the regime where the condition number scales with the radius of the optimization domain. The key idea for proving the lower bound consists of perturbing the hard functions of Hamilton and Moitra (2021) with sums of bump functions chosen by a resisting oracle.
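For intuition, here is a minimal sketch of Riemannian gradient descent on the hyperbolic plane, the method whose optimality the paper establishes in the regime above. It uses the hyperboloid (Lorentz) model of H^2 and minimizes the geodesically convex objective f(x) = d(x, p)^2 / 2; the choice of model, target point p, and step size are illustrative assumptions, not the paper's construction.

    import numpy as np

    J = np.diag([-1.0, 1.0, 1.0])  # Minkowski metric on R^3

    def mink(u, v):
        # Minkowski inner product <u, v>_L = -u0*v0 + u1*v1 + u2*v2
        return u @ J @ v

    def exp_map(x, v):
        # Exponential map at x in the hyperboloid model of H^2
        n = np.sqrt(max(mink(v, v), 0.0))
        if n < 1e-12:
            return x
        return np.cosh(n) * x + np.sinh(n) * (v / n)

    def log_map(x, p):
        # Inverse of exp_map: tangent vector at x pointing toward p,
        # with length equal to the hyperbolic distance d(x, p)
        c = max(-mink(x, p), 1.0)          # cosh of the distance
        d = np.arccosh(c)                  # hyperbolic distance
        u = p + mink(x, p) * x             # component of p tangent at x
        n = np.sqrt(max(mink(u, u), 0.0))  # equals sinh(d)
        return (d / n) * u if n > 1e-12 else np.zeros_like(x)

    # g-convex objective f(x) = 0.5 * d(x, p)^2, whose Riemannian
    # gradient is grad f(x) = -log_map(x, p)
    p = np.array([np.cosh(2.0), np.sinh(2.0), 0.0])  # target at distance 2
    x = np.array([1.0, 0.0, 0.0])                    # hyperboloid origin

    eta = 0.5  # step size (illustrative)
    for _ in range(50):
        x = exp_map(x, eta * log_map(x, p))  # step along -grad f

    print(np.arccosh(max(-mink(x, p), 1.0)))  # distance to p, ~0

The paper's lower bound says that, on hyperbolic spaces and in the stated regime, no deterministic algorithm with exact first-order oracle access can asymptotically outperform this kind of scheme.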


Related research

12/07/2020
Acceleration in Hyperbolic and Spherical Spaces
We further the research on the acceleration phenomenon on Riemannian manifol...

01/14/2021
No-go Theorem for Acceleration in the Hyperbolic Plane
In recent years there has been significant effort to adapt the key tools...

06/05/2023
Curvature and complexity: Better lower bounds for geodesically convex optimization
We study the query complexity of geodesically convex (g-convex) optimiza...

06/25/2019
Complexity of Highly Parallel Non-Smooth Convex Optimization
A landmark result of non-smooth convex optimization is that gradient des...

07/24/2023
Open Problem: Polynomial linearly-convergent method for geodesically convex optimization?
Let f : ℳ → ℝ be a Lipschitz and geodesically convex function defined on a d...

06/16/2023
Memory-Constrained Algorithms for Convex Optimization via Recursive Cutting-Planes
We propose a family of recursive cutting-plane algorithms to solve feasi...

04/28/2018
A Riemannian Corollary of Helly's Theorem
We introduce a notion of halfspace for Hadamard manifolds that is natura...
