Riemannian Natural Gradient Methods

07/15/2022
by Jiang Hu, et al.

This paper studies large-scale optimization problems on Riemannian manifolds whose objective function is a finite sum of negative log-probability losses. Such problems arise in various machine learning and signal processing applications. By introducing the notion of a Fisher information matrix in the manifold setting, we propose a novel Riemannian natural gradient method, which can be viewed as a natural extension of the natural gradient method from the Euclidean setting to the manifold setting. We establish the almost-sure global convergence of the proposed method under standard assumptions. Moreover, we show that if the loss function satisfies certain convexity and smoothness conditions and the input-output map satisfies a Riemannian Jacobian stability condition, then the method enjoys a local linear rate of convergence, which improves to quadratic when the Riemannian Jacobian of the input-output map is Lipschitz continuous. We then prove that the Riemannian Jacobian stability condition holds with high probability for a two-layer fully connected neural network with batch normalization, provided the width of the network is sufficiently large; this demonstrates the practical relevance of the convergence rate result. Numerical experiments on machine learning applications demonstrate the advantages of the proposed method over state-of-the-art ones.
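The abstract only sketches the method, but the basic recipe of a natural gradient step carries over to manifolds: precondition the Riemannian gradient with a (damped) Fisher-type matrix, then retract back onto the manifold. The toy sketch below illustrates this shape on the unit sphere with least-squares losses; it is not the authors' algorithm. The function names (`rngd_step`, `proj_tangent`, `retract`), the damping parameter `lam`, the step size `lr`, and the use of an ambient-space empirical Fisher are all illustrative assumptions.

```python
import numpy as np

def proj_tangent(x, v):
    # Orthogonal projection of v onto the tangent space of the unit sphere at x.
    return v - (x @ v) * x

def retract(x, v):
    # Metric-projection retraction: step in the ambient space, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def rngd_step(x, A, lam=0.1, lr=0.1):
    # Per-sample Euclidean gradients of the toy losses f_i(x) = 0.5 * ||x - a_i||^2.
    G = x[None, :] - A                           # shape (n, d)
    g = proj_tangent(x, G.mean(axis=0))          # Riemannian gradient
    # Damped empirical Fisher built in the ambient space -- a simplification;
    # the paper defines the Fisher information matrix on the manifold itself.
    F = G.T @ G / len(A) + lam * np.eye(x.size)
    d = np.linalg.solve(F, g)                    # preconditioned direction
    return retract(x, -lr * proj_tangent(x, d))  # descend and stay on the sphere

# Toy problem: estimate a direction on the sphere from noisy observations.
rng = np.random.default_rng(0)
A = np.array([0.0, 0.0, 1.0]) + 0.1 * rng.standard_normal((200, 3))
x = np.array([1.0, 0.0, 0.0])
for _ in range(50):
    x = rngd_step(x, A)
```

The damping term `lam * I` plays the usual stabilizing role when the empirical Fisher is ill-conditioned; with it, the iterate converges to the projection of the sample mean onto the sphere.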


Related research

Decentralized Riemannian natural gradient methods with Kronecker-product approximations (03/16/2023)
With a computationally efficient approximation of the second-order infor...

Riemannian Stochastic Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold (05/03/2020)
Riemannian optimization has drawn a lot of attention due to its wide app...

Nonsmooth Optimization over Stiefel Manifold: Riemannian Subgradient Methods (11/12/2019)
Nonsmooth Riemannian optimization is a still under explored subfield of ...

Decentralized Weakly Convex Optimization Over the Stiefel Manifold (03/31/2023)
We focus on a class of non-smooth optimization problems over the Stiefel...

A Riemannian smoothing steepest descent method for non-Lipschitz optimization on submanifolds (04/09/2021)
In this paper, we propose a Riemannian smoothing steepest descent method...

On the Local Linear Rate of Consensus on the Stiefel Manifold (01/22/2021)
We study the convergence properties of Riemannian gradient method for so...

Modified memoryless spectral-scaling Broyden family on Riemannian manifolds (07/18/2023)
This paper presents modified memoryless quasi-Newton methods based on th...
