Exact Linear Convergence Rate Analysis for Low-Rank Symmetric Matrix Completion via Gradient Descent

02/04/2021
by   Trung Vu, et al.

Factorization-based gradient descent is a scalable and efficient algorithm for solving low-rank matrix completion. Recent progress in structured non-convex optimization has offered global convergence guarantees for gradient descent under certain statistical assumptions on the low-rank matrix and the sampling set. However, while the theory shows that gradient descent enjoys fast linear convergence to a global solution, the generality of the bounding techniques prevents them from yielding an accurate estimate of the rate of convergence. In this paper, we perform a local analysis of the exact linear convergence rate of gradient descent for factorization-based completion of symmetric matrices. Without any additional assumptions on the underlying model, we identify a deterministic condition for local convergence of gradient descent that depends only on the solution matrix and the sampling set. More importantly, our analysis provides a closed-form expression for the asymptotic rate of convergence that matches exactly the linear convergence observed in practice. To the best of our knowledge, this is the first result to offer the exact convergence rate of gradient descent for matrix factorization in Euclidean space for matrix completion.
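The iteration studied in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's exact experimental setup: the problem sizes, 50% sampling rate, step size, and near-solution initialization (consistent with a local analysis) are all illustrative assumptions, and the objective is the standard symmetric Burer-Monteiro formulation f(X) = (1/2)‖P_Ω(XXᵀ − M)‖_F².

```python
import numpy as np

# Hedged sketch (illustrative, not the paper's exact setup):
# gradient descent on f(X) = 1/2 || P_Omega(X X^T - M) ||_F^2
# over X in R^{n x r}, for symmetric low-rank matrix completion.

rng = np.random.default_rng(0)
n, r = 50, 2

# Ground-truth rank-r symmetric matrix M = Z Z^T.
Z = rng.standard_normal((n, r))
M = Z @ Z.T

# Symmetric sampling set Omega (observe roughly 50% of entries).
upper = np.triu(rng.random((n, n)) < 0.5)
mask = (upper | upper.T).astype(float)

# Local regime: initialize near the solution, matching a local analysis.
X = Z + 0.1 * rng.standard_normal((n, r))

eta = 0.001  # assumed step size, chosen well below the local smoothness bound
for _ in range(2000):
    R = mask * (X @ X.T - M)   # projected residual P_Omega(X X^T - M), symmetric
    X = X - eta * 2.0 * R @ X  # gradient of f at X is 2 * P_Omega(X X^T - M) X

# Relative error on the observed entries after the iterations.
rel_err = np.linalg.norm(mask * (X @ X.T - M)) / np.linalg.norm(mask * M)
```

Plotting the logarithm of the residual norm against the iteration count in a run like this produces a straight line; its slope is the empirical linear rate that the paper's closed-form expression aims to predict exactly.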


Related research

04/07/2022
Structured Gradient Descent for Fast Robust Low-Rank Hankel Matrix Completion
We study the robust matrix completion problem for the low-rank Hankel ma...

03/13/2022
On the analysis of optimization with fixed-rank matrices: a quotient geometric view
We study a type of Riemannian gradient descent (RGD) algorithm, designed...

06/27/2021
Global Convergence of Gradient Descent for Asymmetric Low-Rank Matrix Factorization
We study the asymmetric low-rank factorization problem: min_𝐔∈ℝ^m ×...

12/29/2021
On Local Convergence of Iterative Hard Thresholding for Matrix Completion
Iterative hard thresholding (IHT) has gained in popularity over the past...

12/22/2021
On Asymptotic Linear Convergence of Projected Gradient Descent for Constrained Least Squares
Many recent problems in signal processing and machine learning such as c...

06/07/2022
Preconditioned Gradient Descent for Overparameterized Nonconvex Burer–Monteiro Factorization with Global Optimality Certification
We consider using gradient descent to minimize the nonconvex function f(...

04/25/2019
Gradient Descent for Sparse Rank-One Matrix Completion for Crowd-Sourced Aggregation of Sparsely Interacting Workers
We consider worker skill estimation for the single-coin Dawid-Skene crow...
