Rank Overspecified Robust Matrix Recovery: Subgradient Method and Exact Recovery

09/23/2021
by Lijun Ding, et al.

We study the robust recovery of a low-rank matrix from sparsely and grossly corrupted Gaussian measurements, with no prior knowledge of the intrinsic rank. We consider the robust matrix factorization approach: we employ a robust ℓ_1 loss function and handle the unknown rank by using an overspecified factored representation of the matrix variable. We then solve the resulting nonconvex, nonsmooth problem with a subgradient method using diminishing stepsizes. We show that, under a regularity condition on the sensing matrices and corruption, which we call the restricted direction preserving property (RDPP), the subgradient method converges to the exact low-rank solution at a sublinear rate even when the rank is overspecified. Moreover, our result is more general in that the rate automatically improves to linear once the factor rank matches the unknown rank. We further show that the RDPP condition holds in generic settings, such as Gaussian measurements under independent or adversarial sparse corruptions, a result that may be of independent interest. Both the exact recovery and the convergence rate of the proposed subgradient method are numerically verified in the overspecified regime. Our experiments also show that the particular design of diminishing stepsizes effectively prevents overfitting in robust recovery under overparameterized models, such as robust matrix sensing and learning a robust deep image prior. This regularization effect merits further investigation.
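As a rough illustration of the approach the abstract describes, the following is a minimal sketch (not the paper's implementation) of a subgradient method on the ℓ_1 loss with an overspecified factorization L R^T and a geometrically diminishing stepsize. The function name, the particular geometric schedule step0 * decay**k, the initialization scale, and all constants are illustrative assumptions; the paper's exact stepsize design and theoretical conditions are more refined.

```python
import numpy as np

def robust_matrix_sensing_subgm(A, y, r_over, n_iters=1000,
                                step0=0.5, decay=0.995, seed=0):
    """Subgradient method sketch for overparameterized robust matrix sensing.

    Minimizes f(L, R) = (1/m) * sum_i |<A_i, L R^T> - y_i| over factors
    L (n1 x r_over) and R (n2 x r_over), where r_over may exceed the true
    rank. Uses a geometrically diminishing stepsize step0 * decay**k (one
    common diminishing schedule; an assumption, not the paper's exact one).
    """
    m, n1, n2 = A.shape
    rng = np.random.default_rng(seed)
    # Small random initialization of the overspecified factors.
    L = 0.1 * rng.standard_normal((n1, r_over))
    R = 0.1 * rng.standard_normal((n2, r_over))
    for k in range(n_iters):
        X = L @ R.T
        resid = np.einsum('ipq,pq->i', A, X) - y      # <A_i, L R^T> - y_i
        s = np.sign(resid)                            # subgradient of |.|
        G = np.einsum('i,ipq->pq', s, A) / m          # (1/m) sum_i s_i A_i
        gL, gR = G @ R, G.T @ L                       # chain rule to factors
        step = step0 * decay**k                       # diminishing stepsize
        L -= step * gL
        R -= step * gR
    return L, R
```

On a small synthetic instance (Gaussian A_i, a rank-2 ground truth, factor rank 4, and 10% of measurements grossly corrupted), this sketch typically drives the relative recovery error down despite the overspecified rank, mirroring the exact-recovery behavior the paper establishes.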


Related research

06/28/2018 · Matrix Recovery from Rank-One Projection Measurements via Nonconvex Minimization
In this paper, we consider the matrix recovery from rank-one projection ...

09/24/2018 · Nonconvex Robust Low-rank Matrix Recovery
In this paper we study the problem of recovering a low-rank matrix from ...

06/16/2020 · Robust Recovery via Implicit Bias of Discrepant Learning Rates for Double Over-parameterization
Recent advances have shown that implicit bias of gradient descent on ove...

02/17/2022 · Global Convergence of Sub-gradient Method for Robust Matrix Recovery: Small Initialization, Noisy Measurements, and Over-parameterization
In this work, we study the performance of sub-gradient method (SubGM) on...

01/19/2019 · One-Bit Sensing of Low-Rank and Bisparse Matrices
This note studies the worst-case recovery error of low-rank and bisparse...

02/25/2020 · On the regularity and conditioning of low rank semidefinite programs
Low rank matrix recovery problems appear widely in statistics, combinato...

07/27/2019 · A Greedy Algorithm for Matrix Recovery with Subspace Prior Information
Matrix recovery is the problem of recovering a low-rank matrix from a fe...
