Robust Matrix Completion with Heavy-tailed Noise

06/09/2022
by   Bingyan Wang, et al.

This paper studies low-rank matrix completion in the presence of heavy-tailed and possibly asymmetric noise, where we aim to estimate an underlying low-rank matrix given a set of highly incomplete noisy entries. Although the matrix completion problem has attracted much attention over the past decade, there is still a lack of theoretical understanding when the observations are contaminated by heavy-tailed noise. Prior theory falls short of explaining the empirical results and is unable to capture the optimal dependence of the estimation error on the noise level. In this paper, we adopt an adaptive Huber loss to accommodate heavy-tailed noise; it is robust against large and possibly asymmetric errors when the parameter in the loss function is carefully designed to balance the Huberization bias against robustness to outliers. We then propose an efficient nonconvex algorithm via a balanced low-rank Burer-Monteiro matrix factorization and gradient descent with robust spectral initialization. We prove that, under a merely bounded second moment condition on the error distributions rather than the sub-Gaussian assumption, the Euclidean error of the iterates generated by the proposed algorithm decreases geometrically fast until it achieves a minimax-optimal statistical estimation error, which has the same order as in the sub-Gaussian case. The key technique behind this significant advancement is a powerful leave-one-out analysis framework. The theoretical results are corroborated by our simulation studies.
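The pipeline described in the abstract can be illustrated with a minimal sketch: factor the target matrix as L @ R.T (Burer-Monteiro), initialize L and R from a truncated SVD of the rescaled observed matrix (spectral initialization), and run gradient descent on a Huberized loss with a balancing term that keeps the two factors on the same scale. This is not the paper's implementation; the step size, iteration count, and the fixed (rather than adaptive) Huber parameter `tau` below are illustrative choices.

```python
import numpy as np

def huber_grad(r, tau):
    """Gradient of the Huber loss: identity on [-tau, tau], clipped outside."""
    return np.clip(r, -tau, tau)

def robust_mc(Y, mask, rank, tau, eta=0.5, iters=400):
    """Sketch of Huber-loss matrix completion via a balanced Burer-Monteiro
    factorization M ~= L @ R.T, gradient descent, and spectral initialization.
    Y holds the (noisy) observed entries, mask is a boolean observation pattern."""
    p = mask.mean()  # empirical observation rate
    # Spectral initialization: truncated SVD of the rescaled observed matrix.
    U, s, Vt = np.linalg.svd((Y * mask) / p, full_matrices=False)
    L = U[:, :rank] * np.sqrt(s[:rank])
    R = Vt[:rank].T * np.sqrt(s[:rank])
    step = eta / s[0]  # step size scaled by the estimated spectral norm
    for _ in range(iters):
        res = (L @ R.T - Y) * mask          # residuals on observed entries
        G = huber_grad(res, tau) / p        # Huberized (clipped) residuals
        # Balancing term penalizes ||L.T @ L - R.T @ R||_F^2.
        gL = G @ R + 0.5 * L @ (L.T @ L - R.T @ R)
        gR = G.T @ L + 0.5 * R @ (R.T @ R - L.T @ L)
        L, R = L - step * gL, R - step * gR
    return L @ R.T
```

Clipping the residuals bounds the influence of any single heavy-tailed noise draw on the gradient, which is what makes the iterates stable under only a second-moment assumption; with `tau` large the update reduces to ordinary least-squares gradient descent.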

