Fast global convergence of gradient descent for low-rank matrix approximation

05/30/2023
by Hengchao Chen, et al.

This paper investigates gradient descent for solving low-rank matrix approximation problems. We begin by establishing the local linear convergence of gradient descent for symmetric matrix approximation. Building on this result, we prove the rapid global convergence of gradient descent, particularly when initialized with small random values. Remarkably, we show that even with moderate random initialization, which includes small random initialization as a special case, gradient descent achieves fast global convergence in scenarios where the top eigenvalues are identical. Furthermore, we extend our analysis to address asymmetric matrix approximation problems and investigate the effectiveness of a retraction-free eigenspace computation method. Numerical experiments strongly support our theory. In particular, the retraction-free algorithm outperforms the corresponding Riemannian gradient descent method, resulting in a significant 29% reduction in runtime.
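For a concrete picture of the setup, the sketch below runs plain gradient descent on the factorized objective f(X) = 0.25·||A − XXᵀ||_F² for a symmetric matrix A, starting from a small random initialization. This is only a minimal NumPy illustration of the kind of iteration analyzed here, not the paper's algorithm or parameter choices: the function name gd_symmetric_lowrank, the step size, the iteration count, and the initialization scale are all placeholder values.

```python
import numpy as np

def gd_symmetric_lowrank(A, r, step=0.01, n_iters=2000, init_scale=1e-3, seed=0):
    """Gradient descent on f(X) = 0.25 * ||A - X X^T||_F^2 with small random init.

    All parameter defaults are illustrative, not values from the paper.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X = init_scale * rng.standard_normal((n, r))   # small random initialization
    for _ in range(n_iters):
        grad = (X @ X.T - A) @ X                   # gradient of 0.25 * ||A - X X^T||_F^2
        X = X - step * grad
    return X

# Toy example: a rank-2 symmetric PSD matrix.
rng = np.random.default_rng(1)
n, r = 50, 2
Q, _ = np.linalg.qr(rng.standard_normal((n, r)))
A = Q @ np.diag([3.0, 2.0]) @ Q.T
X = gd_symmetric_lowrank(A, r)
print("approximation error:", np.linalg.norm(A - X @ X.T, "fro"))
```

Starting from a small initialization, the iterates first grow along the top eigenspace of A and then contract toward the low-rank approximation, which is the kind of two-phase behavior that the local-then-global convergence analysis formalizes.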

Related research

- Fast Global Convergence for Low-rank Matrix Recovery via Riemannian Gradient Descent with Random Initialization (12/31/2020): In this paper, we propose a new global analysis framework for a class of...
- Global Optimality in Distributed Low-rank Matrix Factorization (11/07/2018): We study the convergence of a variant of distributed gradient descent (D...
- Convergence of Alternating Gradient Descent for Matrix Factorization (05/11/2023): We consider alternating gradient descent (AGD) with fixed step size η > ...
- Global Convergence of a Grassmannian Gradient Descent Algorithm for Subspace Estimation (06/24/2015): It has been observed in a variety of contexts that gradient descent meth...
- Algorithmic Regularization in Model-free Overparametrized Asymmetric Matrix Factorization (03/06/2022): We study the asymmetric matrix factorization problem under a natural non...
- Local Stochastic Factored Gradient Descent for Distributed Quantum State Tomography (03/22/2022): We propose a distributed Quantum State Tomography (QST) protocol, named ...
- Asymmetric matrix sensing by gradient descent with small random initialization (09/04/2023): We study matrix sensing, which is the problem of reconstructing a low-ra...
