Accelerated Riemannian Optimization: Handling Constraints with a Prox to Bound Geometric Penalties

11/26/2022
by David Martinez Rubio, et al.

We propose a globally accelerated, first-order method for the optimization of smooth, geodesically convex (possibly strongly convex) functions on a wide class of Hadamard manifolds. We achieve the same convergence rates as Nesterov's accelerated gradient descent, up to a multiplicative geometric penalty and logarithmic factors. Crucially, we can enforce our method to stay within a compact set that we define. Apart from two previous methods of limited applicability, prior fully accelerated works resort to assuming that the iterates of their algorithms stay in some pre-specified compact set. For our class of manifolds, this resolves the open question in [KY22] of obtaining global, general acceleration without assuming that the iterates stay in the feasible set.
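As a rough sketch of what matching Nesterov's rates means here: writing $L$ for the smoothness constant, $\mu$ for the strong g-convexity parameter, $R$ for the initial distance to a minimizer, $\varepsilon$ for the target accuracy, and $\delta$ for a geometric-penalty factor (illustrative shorthand, not notation from the abstract, standing for the multiplicative penalty that depends on the curvature bounds and the diameter of the chosen compact set), the iteration complexities take the form

$O\big(\delta \sqrt{L/\mu}\,\log(1/\varepsilon)\big)$ in the strongly geodesically convex case, and
$O\big(\delta \sqrt{L R^2/\varepsilon}\big)$ in the geodesically convex case,

up to the logarithmic factors mentioned above. The Euclidean specializations with $\delta = 1$ are the classical rates of Nesterov's accelerated gradient descent.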
