Fast Global Convergence for Low-rank Matrix Recovery via Riemannian Gradient Descent with Random Initialization

12/31/2020
by Thomas Y. Hou, et al.

In this paper, we propose a new global analysis framework for a class of low-rank matrix recovery problems on the Riemannian manifold. We analyze the global behavior of Riemannian optimization with random initialization. Using the Riemannian gradient descent algorithm to minimize a least squares loss function, we study both the asymptotic behavior and the exact convergence rate. We reveal a previously unknown geometric property of the low-rank matrix manifold: the existence of spurious critical points of the simple least squares function on the manifold. We show that, under some assumptions, Riemannian gradient descent starting from a random initialization avoids these spurious critical points with high probability and converges only to the ground truth, at a nearly linear convergence rate, i.e., 𝒪(log(1/ϵ) + log(n)) iterations to reach an ϵ-accurate solution. We illustrate the global analysis with two applications. The first is a rank-1 matrix recovery problem. The second is the Gaussian phase retrieval problem, which satisfies only a weak isometry property but otherwise behaves like the first example, apart from an extra saddle set. Our convergence guarantee is nearly optimal and almost dimension-free, which fully explains the numerical observations. The global analysis can potentially be extended to other data problems with random measurement structures and empirical least squares loss functions.
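To make the setting concrete, below is a minimal sketch of Riemannian gradient descent on the fixed-rank matrix manifold for a rank-1 matrix sensing instance. It assumes a Gaussian linear measurement model y_i = ⟨A_i, M⟩, a tangent-space projection of the Euclidean gradient, and a truncated-SVD retraction; the problem sizes, step size, and tolerance are illustrative choices, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 20, 400, 1            # dimension, number of measurements, rank (assumed)

# Ground truth rank-1 matrix M = x x^T and Gaussian measurements y_i = <A_i, M>.
x_star = rng.standard_normal(n)
M = np.outer(x_star, x_star)
A = rng.standard_normal((m, n, n))
y = np.einsum('kij,ij->k', A, M)

def svd_r(Z, r):
    """Rank-r truncated SVD, used both for initialization and as the retraction."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U[:, :r], s[:r], Vt[:r]

def euclidean_grad(X):
    """Gradient of the least squares loss f(X) = (1/2m) * sum_i (<A_i, X> - y_i)^2."""
    residual = np.einsum('kij,ij->k', A, X) - y
    return np.einsum('k,kij->ij', residual, A) / m

# Random initialization on the rank-r manifold.
U, s, Vt = svd_r(rng.standard_normal((n, n)), r)
X = (U * s) @ Vt

eta = 0.75                      # step size (assumed)
for it in range(500):
    G = euclidean_grad(X)
    # Riemannian gradient: project G onto the tangent space at X = U diag(s) Vt.
    PG = U @ (U.T @ G) + (G @ Vt.T) @ Vt - U @ (U.T @ G @ Vt.T) @ Vt
    # Gradient step followed by SVD retraction back onto the manifold.
    U, s, Vt = svd_r(X - eta * PG, r)
    X = (U * s) @ Vt
    err = np.linalg.norm(X - M) / np.linalg.norm(M)
    if err < 1e-10:
        break
print(f"relative error after {it + 1} iterations: {err:.1e}")
```

The Gaussian phase retrieval example in the paper fits the same template with symmetric rank-1 measurements A_i = a_i a_iᵀ, since y_i = |a_iᵀ x|² = ⟨a_i a_iᵀ, x xᵀ⟩ when M = x xᵀ.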

Related research

07/20/2021
Asymptotic Escape of Spurious Critical Points on the Low-rank Matrix Manifold
We show that the Riemannian gradient descent algorithm on the low-rank m...

05/30/2023
Fast global convergence of gradient descent for low-rank matrix approximation
This paper investigates gradient descent for solving low-rank matrix app...

11/17/2020
Recursive Importance Sketching for Rank Constrained Least Squares: Algorithms and High-order Convergence
In this paper, we propose a new recursive importance sketching algorithm fo...

11/28/2019
Analysis of Asymptotic Escape of Strict Saddle Sets in Manifold Optimization
In this paper, we provide some analysis on the asymptotic escape of stri...

07/20/2022
Alternating minimization for generalized rank one matrix sensing: Sharp predictions from a random initialization
We consider the problem of estimating the factors of a rank-1 matrix wit...

04/25/2022
Randomly Initialized Alternating Least Squares: Fast Convergence for Matrix Sensing
We consider the problem of reconstructing rank-one matrices from random ...

06/24/2015
Global Convergence of a Grassmannian Gradient Descent Algorithm for Subspace Estimation
It has been observed in a variety of contexts that gradient descent meth...
