Nonsmooth Optimization over Stiefel Manifold: Riemannian Subgradient Methods

11/12/2019
by Xiao Li et al.

Nonsmooth Riemannian optimization remains an underexplored subfield of manifold optimization. In this paper, we study optimization problems over the Stiefel manifold with nonsmooth objective functions. Problems of this type arise widely in engineering. We propose to address them with Riemannian subgradient-type methods: Riemannian full, incremental, and stochastic subgradient methods. When the objective function is weakly convex, we show an iteration complexity of O(ε^-4) for these algorithms to achieve an ε-small surrogate stationarity measure. Moreover, the Riemannian full and incremental subgradient methods achieve local linear convergence if the optimization problem further satisfies the sharpness regularity property. The fundamental ingredient for establishing these convergence results is that any locally Lipschitz continuous weakly convex function in Euclidean space admits a Riemannian subgradient inequality uniformly over the Stiefel manifold, a result of independent interest. We then extend our convergence results to a broader class of compact Riemannian manifolds embedded in Euclidean space. Finally, as a demonstration of applications, we discuss the sharpness property for robust subspace recovery and orthogonal dictionary learning, and conduct experiments on these two problems to illustrate the performance of our algorithms.
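To make the setup concrete, a single Riemannian subgradient iteration on the Stiefel manifold St(n, p) = {X : XᵀX = I} typically consists of three pieces: compute a Euclidean subgradient, project it onto the tangent space at the current iterate, and retract the update back onto the manifold. The sketch below illustrates this with a standard tangent-space projection and a QR-based retraction; the toy objective f(X) = ||AX||₁ and the step-size schedule are hypothetical illustrations chosen here, not taken from the paper.

```python
import numpy as np

def proj_tangent(X, G):
    """Project a Euclidean (sub)gradient G onto the tangent space of
    St(n, p) at X, via G - X * sym(X^T G)."""
    XtG = X.T @ G
    return G - X @ ((XtG + XtG.T) / 2)

def retract_qr(Y):
    """QR-based retraction back onto the Stiefel manifold."""
    Q, R = np.linalg.qr(Y)
    # Fix column signs (positive diagonal of R) so the map is well-defined.
    return Q * np.sign(np.diag(R))

def riemannian_subgradient_step(X, subgrad, step):
    """One iteration: project the subgradient, step, retract."""
    xi = proj_tangent(X, subgrad(X))
    return retract_qr(X - step * xi)

# Toy nonsmooth weakly convex objective (illustrative only):
# f(X) = ||A X||_1 over St(n, p).
rng = np.random.default_rng(0)
n, p = 10, 3
A = rng.standard_normal((n, n))
X = retract_qr(rng.standard_normal((n, p)))

subgrad = lambda X: A.T @ np.sign(A @ X)  # a Euclidean subgradient of f

for k in range(200):
    # Diminishing step sizes, as is standard for subgradient methods.
    X = riemannian_subgradient_step(X, subgrad, step=0.1 / np.sqrt(k + 1))

# The iterate stays (numerically) feasible: X^T X = I_p.
print(np.allclose(X.T @ X, np.eye(p), atol=1e-8))
```

The incremental and stochastic variants studied in the paper differ only in replacing the full subgradient with a subgradient of one component function (cyclically or sampled at random); the projection and retraction steps are unchanged.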

