Nonconvex Rectangular Matrix Completion via Gradient Descent without ℓ_2,∞ Regularization

01/18/2019
by Ji Chen et al.

The analysis of nonconvex matrix completion has recently attracted much attention in the machine learning community thanks to its computational convenience. Existing analyses of this problem, however, usually rely on ℓ_2,∞ projection or regularization involving unknown model parameters, even though such steps are observed to be unnecessary in numerical simulations (see, e.g., Zheng and Lafferty [2016]). In this paper, we extend the analysis of vanilla gradient descent for positive semidefinite matrix completion proposed in Ma et al. [2017] to the rectangular case and, more significantly, improve the required sampling complexity from O(r^3) to O(r^2). Our technical ideas and contributions are potentially useful for improving the leave-one-out analysis in other related problems.
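The setting studied in the abstract can be illustrated with a minimal sketch: factorize the unknown rectangular matrix as XY^T, initialize spectrally from the rescaled observed entries, and run plain gradient descent on the squared loss over the observed entries with no ℓ_2,∞ projection or regularization. The problem sizes, step size, and iteration count below are illustrative assumptions, not the paper's exact algorithm or constants.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, r, p = 60, 50, 3, 0.5  # assumed dimensions, rank, sampling rate

# Ground-truth rank-r matrix and Bernoulli(p) observation mask
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
mask = rng.random((n1, n2)) < p

# Spectral initialization: rank-r SVD of the zero-filled data rescaled by 1/p
U, s, Vt = np.linalg.svd(mask * M / p, full_matrices=False)
X = U[:, :r] * np.sqrt(s[:r])
Y = Vt[:r].T * np.sqrt(s[:r])

# Vanilla gradient descent on f(X, Y) = 0.5 * ||P_Omega(X Y^T - M)||_F^2,
# with no projection or regularization; step size scaled by the top singular value
eta = 0.5 / s[0]
for _ in range(500):
    R = mask * (X @ Y.T - M)            # residual on observed entries only
    X, Y = X - eta * (R @ Y), Y - eta * (R.T @ X)

rel_err = np.linalg.norm(X @ Y.T - M) / np.linalg.norm(M)
```

With these (generous) sampling and conditioning assumptions, `rel_err` shrinks to a small value, consistent with the numerical observations the abstract cites that the extra regularization is unnecessary in practice.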

Related research

05/23/2016: Convergence Analysis for Rectangular Matrix Completion Using Burer-Monteiro Factorization and Gradient Descent
"We address the rectangular matrix completion problem by lifting the unkn..."

01/21/2021: Robust spectral compressive sensing via vanilla gradient descent
"This paper investigates robust recovery of an undamped or damped spectra..."

04/08/2019: Binary matrix completion with nonconvex regularizers
"Many practical problems involve the recovery of a binary matrix from par..."

12/19/2022: Rank-1 Matrix Completion with Gradient Descent and Small Random Initialization
"The nonconvex formulation of matrix completion problem has received sign..."

03/29/2016: Unified View of Matrix Completion under General Structural Constraints
"In this paper, we present a unified analysis of matrix completion under ..."

08/01/2018: Matrix completion and extrapolation via kernel regression
"Matrix completion and extrapolation (MCEX) are dealt with here over repr..."
