Global Optimality of Local Search for Low Rank Matrix Recovery

05/23/2016
by Srinadh Bhojanapalli et al.

We show that there are no spurious local minima in the non-convex factorized parametrization of low-rank matrix recovery from incoherent linear measurements. With noisy measurements, we show that all local minima lie very close to a global optimum. Together with a curvature bound at saddle points, this yields a polynomial-time global convergence guarantee for stochastic gradient descent from random initialization.
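As a concrete illustration of the factorized parametrization, the sketch below runs plain gradient descent on a synthetic matrix-sensing instance. It is not the authors' code: the Gaussian measurement matrices, problem sizes, step size, and the use of full rather than stochastic gradients are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the authors' implementation): gradient descent on the
# factorized matrix-sensing objective
#     f(U) = (1 / 2m) * sum_i ( <A_i, U U^T> - b_i )^2 ,   U in R^{n x r},
# with synthetic Gaussian measurement matrices A_i. The paper analyzes a
# stochastic gradient variant from random initialization; full gradients are
# used here only to keep the example short.

rng = np.random.default_rng(0)
n, r, m = 40, 3, 1200                                # dimension, rank, #measurements

U_star = rng.standard_normal((n, r)) / np.sqrt(n)    # ground-truth factor
M_star = U_star @ U_star.T                           # rank-r PSD target

A = rng.standard_normal((m, n, n))                   # measurement matrices
b = np.einsum('kij,ij->k', A, M_star)                # noiseless measurements

def grad(U):
    """Gradient of f at U: (1/m) * sum_i r_i (A_i + A_i^T) U, with residuals r_i."""
    residual = np.einsum('kij,ij->k', A, U @ U.T) - b
    G = np.einsum('k,kij->ij', residual, A) / m
    return (G + G.T) @ U

U = 0.1 * rng.standard_normal((n, r)) / np.sqrt(n)   # small random initialization
step = 0.05
for _ in range(1000):
    U -= step * grad(U)

print("relative error:",
      np.linalg.norm(U @ U.T - M_star) / np.linalg.norm(M_star))
```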

Related research

05/18/2021 – Sharp Restricted Isometry Property Bounds for Low-rank Matrix Recovery Problems with Corrupted Measurements
In this paper, we study a general low-rank matrix recovery problem with ...

04/29/2022 – Escaping Spurious Local Minima of Low-Rank Matrix Factorization Through Convex Lifting
This work proposes a rapid global solver for nonconvex low-rank matrix f...

02/15/2023 – Over-parametrization via Lifting for Low-rank Matrix Sensing: Conversion of Spurious Solutions to Strict Saddle Points
This paper studies the role of over-parametrization in solving non-conve...

06/30/2021 – Deep Linear Networks Dynamics: Low-Rank Biases Induced by Initialization Scale and L2 Regularization
For deep linear networks (DLN), various hyperparameters alter the dynami...

11/07/2018 – Global Optimality in Distributed Low-rank Matrix Factorization
We study the convergence of a variant of distributed gradient descent (D...

09/12/2016 – Non-square matrix sensing without spurious local minima via the Burer-Monteiro approach
We consider the non-square matrix sensing problem, under restricted isom...

03/07/2022 – Flat minima generalize for low-rank matrix recovery
Empirical evidence suggests that for a variety of overparameterized nonl...
