Low-Rank Matrix Recovery with Scaled Subgradient Methods: Fast and Robust Convergence Without the Condition Number

10/26/2020
by Tian Tong, et al.

Many problems in data science can be treated as estimating a low-rank matrix from highly incomplete, sometimes even corrupted, observations. One popular approach is to resort to matrix factorization, where the low-rank matrix factors are optimized via first-order methods over a smooth loss function, such as the residual sum of squares. While tremendous progress has been made in recent years, the natural smooth formulation suffers from two sources of ill-conditioning: the iteration complexity of gradient descent scales poorly with both the dimension and the condition number of the low-rank matrix. Moreover, the smooth formulation is not robust to corruptions. In this paper, we propose scaled subgradient methods to minimize a family of nonsmooth and nonconvex formulations (in particular, the residual sum of absolute errors), which are guaranteed to converge at a fast rate that is almost dimension-free and independent of the condition number, even in the presence of corruptions. We illustrate the effectiveness of our approach when the observation operator satisfies certain mixed-norm restricted isometry properties, and derive state-of-the-art performance guarantees for a variety of problems such as robust low-rank matrix sensing and quadratic sampling.
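As a rough illustration of the approach described in the abstract, the sketch below shows one plausible instantiation of a scaled subgradient iteration for robust low-rank matrix sensing with the residual sum of absolute errors. The measurement model, the spectral-style initialization, the geometrically decaying step size, and all names and parameters (scaled_subgradient_sensing, eta0, decay, etc.) are illustrative assumptions, not the paper's exact algorithm or code.

```python
import numpy as np

def scaled_subgradient_sensing(As, y, r, iters=300, eta0=1.0, decay=0.97):
    """Sketch of a scaled subgradient method for robust low-rank matrix
    sensing with the l1 loss f(L, R) = (1/m) * sum_i |<A_i, L R^T> - y_i|.

    As : (m, n1, n2) array stacking the measurement matrices A_i.
    y  : (m,) observations, possibly with sparse corruptions.
    r  : target rank.
    """
    m, n1, n2 = As.shape

    # Spectral-style initialization from the back-projected measurements
    # (an illustrative stand-in for the paper's initialization step).
    M0 = np.einsum('i,ijk->jk', y, As) / m
    U, s, Vt = np.linalg.svd(M0, full_matrices=False)
    L = U[:, :r] * np.sqrt(s[:r])
    R = Vt[:r, :].T * np.sqrt(s[:r])

    eta = eta0
    for _ in range(iters):
        # Residuals <A_i, L R^T> - y_i and a subgradient of the l1 loss.
        residual = np.einsum('ijk,jk->i', As, L @ R.T) - y
        G = np.einsum('i,ijk->jk', np.sign(residual), As) / m
        QL, QR = G @ R, G.T @ L

        # Scaled (preconditioned) subgradient steps: right-multiply by the
        # inverse Gram matrix of the other factor, with a geometrically
        # decaying step size.
        L_new = L - eta * QL @ np.linalg.inv(R.T @ R)
        R_new = R - eta * QR @ np.linalg.inv(L.T @ L)
        L, R = L_new, R_new
        eta *= decay

    return L, R
```

The key departure from a plain subgradient step is the right-multiplication by (R^T R)^{-1} and (L^T L)^{-1}, which adapts the update to the local geometry of the factorized problem so the iteration count does not degrade with the condition number of the low-rank matrix; the geometrically decaying step size is one common schedule for sharp nonsmooth losses of this kind.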


Related research

- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence (04/22/2019). The task of recovering a low-rank matrix from its noisy linear measureme...
- The Power of Preconditioning in Overparameterized Low-Rank Matrix Sensing (02/02/2023). We propose a preconditioned gradient descent method to tackle the low-...
- Gauss-Southwell type descent methods for low-rank matrix optimization (06/01/2023). We consider gradient-related methods for low-rank matrix optimization wi...
- Deflated HeteroPCA: Overcoming the curse of ill-conditioning in heteroskedastic PCA (03/10/2023). This paper is concerned with estimating the column subspace of a low-ran...
- Accelerating Ill-Conditioned Low-Rank Matrix Estimation via Scaled Gradient Descent (05/18/2020). Low-rank matrix estimation is a canonical problem that finds numerous ap...
- Low-rank matrix recovery with non-quadratic loss: projected gradient method and regularity projection oracle (08/31/2020). Existing results for low-rank matrix recovery largely focus on quadratic...
- Fast Generalized Conditional Gradient Method with Applications to Matrix Recovery Problems (02/15/2018). Motivated by matrix recovery problems such as Robust Principal Component...
