A Nonconvex Free Lunch for Low-Rank plus Sparse Matrix Recovery

02/21/2017
by Xiao Zhang, et al.

We study the problem of low-rank plus sparse matrix recovery. We propose a generic and efficient nonconvex optimization algorithm based on projected gradient descent and a double thresholding operator, with much lower computational complexity than existing convex-relaxation based methods. Compared with those methods, the proposed algorithm recovers the low-rank plus sparse matrices for free, without incurring any additional statistical cost. It not only enables exact recovery of the unknown low-rank and sparse matrices in the noiseless setting and achieves the minimax optimal statistical error rate in the noisy case, but also matches the best-known robustness guarantee (i.e., tolerance for sparse corruption). At the core of our theory is a novel structural Lipschitz gradient condition for low-rank plus sparse matrices, which is essential for proving the linear convergence rate of our algorithm and which we believe is of independent interest for proving fast rates for general superposition-structured models. We demonstrate the superiority of our generic algorithm, both theoretically and experimentally, through three concrete applications: robust matrix sensing, robust PCA, and one-bit matrix decomposition.


Related research:

- 01/09/2017: A Universal Variance Reduction-Based Catalyst for Nonconvex Low-Rank Matrix Recovery. "We propose a generic framework based on a new stochastic variance-reduce..."
- 02/20/2018: Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach. "We study the problem of recovery of matrices that are simultaneously low..."
- 02/23/2011: Noisy matrix decomposition via convex relaxation: Optimal rates in high dimensions. "We analyze a class of estimators based on convex relaxation for solving ..."
- 01/29/2022: Exact Decomposition of Joint Low Rankness and Local Smoothness Plus Sparse Matrices. "It is known that the decomposition in low-rank and sparse matrices (L+S ..."
- 08/01/2019: Low-Rank plus Sparse Decomposition of Covariance Matrices using Neural Network Parametrization. "This paper revisits the problem of decomposing a positive semidefinite m..."
- 07/07/2023: Improved Algorithms for White-Box Adversarial Streams. "We study streaming algorithms in the white-box adversarial stream model,..."
- 02/20/2014: Multi-Step Stochastic ADMM in High Dimensions: Applications to Sparse Optimization and Noisy Matrix Decomposition. "We propose an efficient ADMM method with guarantees for high-dimensional..."
