Optimal tuning-free convex relaxation for noisy matrix completion

by Yuepeng Yang et al.

This paper is concerned with noisy matrix completion: the problem of recovering a low-rank matrix from partial and noisy entries. Under uniform sampling and incoherence assumptions, we prove that a tuning-free square-root matrix completion estimator (square-root MC) achieves optimal statistical performance for noisy matrix completion. Like the square-root Lasso estimator in high-dimensional linear regression, square-root MC does not require knowledge of the noise level. While square-root MC is a convex program, our statistical analysis hinges on its intimate connection to a nonconvex rank-constrained estimator.
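For concreteness, a square-root-type matrix completion estimator can be sketched as follows (a sketch in standard notation, not quoted verbatim from the paper: \(Y\) collects the observed noisy entries, \(\mathcal{P}_{\Omega}\) projects onto the set of observed indices \(\Omega\), and \(\lambda\) is a regularization weight):

\[
\widehat{M} \;=\; \arg\min_{M \in \mathbb{R}^{n_1 \times n_2}} \;\; \bigl\|\mathcal{P}_{\Omega}(M - Y)\bigr\|_{\mathrm{F}} \;+\; \lambda \,\|M\|_{*},
\]

where \(\|\cdot\|_{\mathrm{F}}\) is the Frobenius norm and \(\|\cdot\|_{*}\) the nuclear norm. The key structural feature, mirroring the square-root Lasso, is that the data-fidelity term enters without being squared; this is what allows the weight \(\lambda\) to be chosen independently of the noise size, making the method tuning-free with respect to the noise level.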


Related research:

- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization. This paper studies noisy low-rank matrix completion: given partial and c...

- Insights and algorithms for the multivariate square-root lasso. We study the multivariate square-root lasso, a method for fitting the mu...

- Square Root Principal Component Pursuit: Tuning-Free Noisy Robust Matrix Recovery. We propose a new framework, Square Root Principal Component Pursuit, f...

- Matrix Completion from Quantized Samples via Generalized Sparse Bayesian Learning. The recovery of a low rank matrix from a subset of noisy low-precision q...

- Matrix Completion of World Trade. This work applies Matrix Completion (MC), a class of machine-learning m...

- Matrix completion and extrapolation via kernel regression. Matrix completion and extrapolation (MCEX) are dealt with here over repr...

- Square Root Marginalization for Sliding-Window Bundle Adjustment. In this paper we propose a novel square root sliding-window bundle adjus...
