Rank-1 Matrix Completion with Gradient Descent and Small Random Initialization

12/19/2022
by Daesung Kim, et al.

The nonconvex formulation of the matrix completion problem has received significant attention in recent years due to its affordable complexity compared to the convex formulation. Gradient descent (GD) is the simplest yet an efficient baseline algorithm for solving nonconvex optimization problems, and its success when combined with random initialization has been witnessed in many different problems, both in theory and in practice. However, previous works on matrix completion require either careful initialization or regularizers to prove the convergence of GD. In this work, we study rank-1 symmetric matrix completion and prove that GD converges to the ground truth when a small random initialization is used. We show that within a logarithmic number of iterations, the trajectory enters the region where local convergence occurs. We provide an upper bound on the initialization size that is sufficient to guarantee convergence, and show that a larger initialization can be used as more samples become available. We observe that the implicit regularization effect of GD plays a critical role in the analysis: throughout the entire trajectory, it prevents each entry of the iterate from becoming much larger than the others.
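To make the setting concrete, below is a minimal NumPy sketch of the procedure the abstract describes: gradient descent from a small random initialization on a factored loss for rank-1 symmetric matrix completion. This is an illustration under stated assumptions, not the paper's exact algorithm or analysis setup; the dimension d, sampling probability p, initialization scale alpha, step size eta, and iteration count are all hypothetical choices.

```python
import numpy as np

# Sketch (assumed setup, not the paper's exact one): minimize the factored
# loss f(x) = (1/(4p)) * ||P_Omega(x x^T - M)||_F^2 by gradient descent,
# where M = x* x*^T is a rank-1 symmetric ground truth and P_Omega keeps
# only the observed entries.

rng = np.random.default_rng(0)

d = 100              # dimension (assumed)
p = 0.3              # per-entry observation probability (assumed)
alpha = 1e-6         # small random initialization scale (assumed)
eta = 0.5 / (p * d)  # step size (assumed heuristic)
num_iters = 2000

x_star = rng.standard_normal(d)
x_star /= np.linalg.norm(x_star)      # ground-truth factor, ||x*|| = 1
M = np.outer(x_star, x_star)          # rank-1 symmetric target

upper = np.triu(rng.random((d, d)) < p)  # sample upper triangle with prob. p
mask = upper | upper.T                   # symmetric observation set Omega

x = alpha * rng.standard_normal(d)    # small random initialization

for _ in range(num_iters):
    R = mask * (np.outer(x, x) - M)   # residual restricted to Omega
    x = x - eta * (R @ x) / p         # GD step; grad f(x) = (1/p) * R x

# x can only recover x* up to a global sign flip
err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
print(f"recovery error after {num_iters} iterations: {err:.3e}")
```

In this sketch the norm of the iterate starts at roughly alpha * sqrt(d) and, mirroring the abstract's claim, grows geometrically along the ground-truth direction, so only a logarithmic number of iterations (in 1/alpha) is spent before the locally convergent phase begins.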
