Gauss-Southwell type descent methods for low-rank matrix optimization

06/01/2023
by Guillaume Olikier et al.

We consider gradient-related methods for low-rank matrix optimization with a smooth cost function. The methods operate on single factors of the low-rank factorization and share aspects of both alternating and Riemannian optimization. Two possible choices for the search directions, based on Gauss-Southwell-type selection rules, are compared: one uses the gradient of a factorized non-convex formulation, the other the Riemannian gradient. While both methods provide gradient convergence guarantees similar to those of the unconstrained case, the version based on the Riemannian gradient is significantly more robust with respect to small singular values and the condition number of the cost function, as illustrated by numerical experiments. As a side result of our approach, we also obtain new convergence results for the alternating least squares method.
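To make the idea concrete, here is a minimal sketch (not the paper's exact algorithm) of a Gauss-Southwell-type rule for the factorized formulation: for f(L, R) = ½‖LRᵀ − A‖²_F, each iteration updates only the factor whose partial gradient has the larger Frobenius norm, with an exact line search along that gradient (the subproblem in one factor is quadratic). The function names and step-size choice are illustrative assumptions.

```python
import numpy as np

def gs_step(A, L, R):
    """One Gauss-Southwell-type step on f(L, R) = 0.5*||L @ R.T - A||_F^2.

    Illustrative sketch: select the factor with the larger gradient norm and
    take an exact line-search step along its negative gradient (the problem
    restricted to one factor is quadratic, so the optimal step is closed-form).
    """
    E = L @ R.T - A          # residual
    gL = E @ R               # gradient w.r.t. L
    gR = E.T @ L             # gradient w.r.t. R
    # Gauss-Southwell-type selection: descend on the block whose gradient is larger.
    if np.linalg.norm(gL) >= np.linalg.norm(gR):
        D = gL @ R.T         # change of L @ R.T per unit step in -gL
        t = (gL ** 2).sum() / max((D ** 2).sum(), 1e-30)
        L = L - t * gL
    else:
        D = L @ gR.T         # change of L @ R.T per unit step in -gR
        t = (gR ** 2).sum() / max((D ** 2).sum(), 1e-30)
        R = R - t * gR
    return L, R

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic exactly rank-3 target.
    A = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 15))
    L = 0.1 * rng.standard_normal((20, 3))
    R = 0.1 * rng.standard_normal((15, 3))
    for _ in range(500):
        L, R = gs_step(A, L, R)
    print(np.linalg.norm(L @ R.T - A) / np.linalg.norm(A))
```

The exact line search guarantees monotone decrease of the cost without any step-size tuning; the Riemannian variant studied in the paper instead works with the gradient on the fixed-rank manifold, which the abstract reports to be more robust to small singular values.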


Related research

- Asymptotic Escape of Spurious Critical Points on the Low-rank Matrix Manifold (07/20/2021): We show that the Riemannian gradient descent algorithm on the low-rank m...
- Low-Rank Matrix Recovery with Scaled Subgradient Methods: Fast and Robust Convergence Without the Condition Number (10/26/2020): Many problems in data science can be treated as estimating a low-rank ma...
- Local convergence of alternating low-rank optimization methods with overrelaxation (11/29/2021): The local convergence of alternating optimization methods with overrelax...
- Riemannian thresholding methods for row-sparse and low-rank matrix recovery (03/03/2021): In this paper, we present modifications of the iterative hard thresholdi...
- Nonlinear matrix recovery using optimization on the Grassmann manifold (09/13/2021): We investigate the problem of recovering a partially observed high-rank ...
- Critical Points and Convergence Analysis of Generative Deep Linear Networks Trained with Bures-Wasserstein Loss (03/06/2023): We consider a deep matrix factorization model of covariance matrices tra...
- Manifold Free Riemannian Optimization (09/07/2022): Riemannian optimization is a principled framework for solving optimizati...
