Unconstrained optimisation on Riemannian manifolds

08/25/2020
by Tuyen Trung Truong, et al.

In this paper, we give explicit descriptions of versions of (Local-) Backtracking Gradient Descent and New Q-Newton's method in the Riemannian setting. Here are some easy-to-state consequences of the results in this paper, where X is a general Riemannian manifold of finite dimension and f:X→ℝ a C^2 function which is Morse (that is, all its critical points are non-degenerate).

Theorem. For random choices of the hyperparameters in the Riemannian Local Backtracking Gradient Descent algorithm and for random choices of the initial point x_0, the sequence {x_n} constructed by the algorithm either (i) converges to a local minimum of f or (ii) eventually leaves every compact subset of X (in other words, diverges to infinity on X). If f has compact sublevels, then only the former alternative happens. The convergence rate is the same as in the classical paper by Armijo.

Theorem. Assume that f is C^3. For random choices of the hyperparameters in the Riemannian New Q-Newton's method, if the sequence constructed by the algorithm converges, then the limit is a critical point of f. We have a local Stable-Center manifold theorem, near saddle points of f, for the dynamical system associated to the algorithm. If the limit point is a non-degenerate minimum point, then the rate of convergence is quadratic. If moreover X is an open subset of a Lie group and the initial point x_0 is chosen randomly, then we can globally avoid saddle points.

As an application, we propose a general method using Riemannian Backtracking GD to find the minimum of a function on a bounded ball in a Euclidean space, and carry out explicit calculations for the smallest eigenvalue of a symmetric square matrix.
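To make the eigenvalue application concrete, here is a minimal sketch (not the paper's exact algorithm) of Riemannian gradient descent with Armijo backtracking on the unit sphere: minimising the Rayleigh quotient f(x) = xᵀAx over the sphere yields the smallest eigenvalue of a symmetric matrix A. The function name, hyperparameter values, and stopping tolerance are illustrative choices, not taken from the paper.

```python
import numpy as np

def smallest_eigenvalue(A, max_iter=500, tol=1e-10, alpha=0.3, beta=0.5, t0=1.0):
    """Minimise f(x) = x^T A x over the unit sphere by Riemannian gradient
    descent with Armijo backtracking.  For symmetric A, the minimum value
    is the smallest eigenvalue.  (Illustrative sketch, not the paper's
    exact algorithm.)"""
    rng = np.random.default_rng(0)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)               # random initial point on the sphere
    f = lambda v: v @ A @ v
    for _ in range(max_iter):
        g_euc = 2.0 * (A @ x)            # Euclidean gradient of the quadratic
        g = g_euc - (g_euc @ x) * x      # project to tangent space: Riemannian gradient
        gn2 = g @ g
        if gn2 < tol:
            break
        t, fx = t0, f(x)
        while True:                      # Armijo backtracking line search
            y = x - t * g
            y /= np.linalg.norm(y)       # retract back onto the sphere
            if f(y) <= fx - alpha * t * gn2 or t < 1e-16:
                break
            t *= beta                    # shrink the step until sufficient decrease
        x = y
    return f(x), x
```

For a generic starting point, the iterates converge to a minimiser of the Rayleigh quotient, i.e. an eigenvector for the smallest eigenvalue; the retraction (renormalisation) plays the role of the exponential map in this simple geometry.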


research
06/18/2019

Escaping from saddle points on Riemannian manifolds

We consider minimizing a nonconvex, smooth function f on a Riemannian ma...
research
11/11/2019

Convergence to minima for the continuous version of Backtracking Gradient Descent

The main result of this paper is: Theorem. Let f:R^k→R be a C^1 funct...
research
09/09/2021

Function recovery on manifolds using scattered data

We consider the task of recovering a Sobolev function on a connected com...
research
01/16/2020

Some convergent results for Backtracking Gradient Descent method on Banach spaces

Our main result concerns the following condition: Condition C. Let X ...
research
10/31/2022

Central limit theorem for intrinsic Frechet means in smooth compact Riemannian manifolds

We prove a central limit theorem (CLT) for the Frechet mean of independe...
research
02/11/2019

The Riemannian barycentre as a proxy for global optimisation

Let M be a simply-connected compact Riemannian symmetric space, and U a ...
