Nuclear norm penalization and optimal rates for noisy low rank matrix completion

11/29/2010
by Vladimir Koltchinskii, et al.

This paper deals with the trace regression model, in which n entries or linear combinations of entries of an unknown m_1 × m_2 matrix A_0, corrupted by noise, are observed. We propose a new nuclear norm penalized estimator of A_0 and establish a general sharp oracle inequality for this estimator for arbitrary values of n, m_1, m_2 under the condition of isometry in expectation. This method is then applied to the matrix completion problem. In this case the estimator admits a simple explicit form, and we prove that it satisfies oracle inequalities with faster rates of convergence than in previous work. These inequalities are valid, in particular, in the high-dimensional setting m_1 m_2 ≫ n. We show that the obtained rates are optimal up to logarithmic factors in a minimax sense, and we also derive, for any fixed matrix A_0, a non-minimax lower bound on the rate of convergence of our estimator, which coincides with the upper bound up to a constant factor. Finally, we show that our procedure recovers the rank of A_0 exactly with probability close to 1. We also discuss the statistical learning setting in which there is no underlying model determined by A_0 and the aim is to find the best trace regression model approximating the data.
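As a point of reference, the display below is a minimal sketch of a nuclear norm penalized least squares formulation of the kind described above; the noise notation ξ_i and the exact placement of the regularization parameter λ are standard conventions assumed here rather than details quoted from this abstract.

```latex
% Trace regression model: Y_i = <X_i, A_0> + \xi_i,  i = 1, ..., n,
% where <X, A> = tr(X^T A) and \xi_i is a noise variable.
% Nuclear norm penalized least squares estimator
% (\|A\|_1 denotes the nuclear norm, i.e. the sum of singular values):
\hat{A}^{\lambda} \in \operatorname*{arg\,min}_{A \in \mathbb{R}^{m_1 \times m_2}}
  \left\{ \frac{1}{n} \sum_{i=1}^{n} \bigl( Y_i - \langle X_i, A \rangle \bigr)^2
          + \lambda \, \| A \|_{1} \right\}
```

For the matrix completion case, where the abstract states that the estimator admits a simple explicit form, one well-known closed form of this type is soft-thresholding of the singular values of a rescaled zero-filled data matrix under uniform sampling of entries. The Python sketch below illustrates that idea; the function names, the rescaling factor m_1 m_2 / n, and the toy value of λ are illustrative assumptions, not a verbatim implementation of the paper's procedure.

```python
import numpy as np

def soft_threshold_svd(M, lam):
    """Proximal operator of the nuclear norm: shrink every singular
    value of M by lam and drop those that become non-positive."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)
    return (U * s_shrunk) @ Vt

def complete_matrix(rows, cols, y, shape, lam):
    """Hypothetical closed-form estimator for matrix completion under
    uniform sampling: soft-threshold the SVD of the rescaled matrix
    (m1 * m2 / n) * sum_i y_i e_{rows[i]} e_{cols[i]}^T."""
    m1, m2 = shape
    n = len(y)
    X = np.zeros((m1, m2))
    np.add.at(X, (rows, cols), y)  # accumulate repeated observations
    X *= m1 * m2 / n
    return soft_threshold_svd(X, lam)

# Toy usage: recover a rank-one matrix from noisy, uniformly sampled entries.
rng = np.random.default_rng(0)
m1, m2, n = 50, 40, 800
A0 = np.outer(rng.normal(size=m1), rng.normal(size=m2))
rows = rng.integers(0, m1, size=n)
cols = rng.integers(0, m2, size=n)
y = A0[rows, cols] + 0.1 * rng.normal(size=n)
A_hat = complete_matrix(rows, cols, y, (m1, m2), lam=2.0)
```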

Related research

Low Rank Matrix Completion with Exponential Family Noise (02/24/2015)
The matrix completion problem consists in reconstructing a matrix from a...

On the Optimality of Nuclear-norm-based Matrix Completion for Problems with Smooth Non-linear Structure (05/05/2021)
Originally developed for imputing missing entries in low rank, or approx...

Outlier-robust sparse/low-rank least-squares regression and robust matrix completion (12/12/2020)
We consider high-dimensional least-squares regression when a fraction ϵ ...

Exponential weights in multivariate regression and a low-rankness favoring prior (06/25/2018)
We establish theoretical guarantees for the expected prediction error of...

Nonparametric estimation of low rank matrix valued function (02/17/2018)
Let A:[0,1]→H_m (the space of Hermitian matrices) be a matrix valued fun...

High-dimensional estimation of quadratic variation based on penalized realized variance (03/04/2021)
In this paper, we develop a penalized realized variance (PRV) estimator ...

Two-sided Matrix Regression (03/08/2023)
The two-sided matrix regression model Y = A^*X B^* +E aims at predicting...
