Noisy Tensor Completion via the Sum-of-Squares Hierarchy

01/26/2015
by Boaz Barak and Ankur Moitra

In the noisy tensor completion problem we observe m entries (whose locations are chosen uniformly at random) of an unknown n_1 × n_2 × n_3 tensor T. We assume that T is entry-wise close to being rank r, and our goal is to fill in its missing entries using as few observations as possible. Let n = max{n_1, n_2, n_3}. We show that if m = n^{3/2} r then there is a polynomial-time algorithm, based on the sixth level of the sum-of-squares hierarchy, for completing it. Our estimate agrees almost exactly with almost all of T's entries, and it works even when the observations are corrupted by noise. This is also the first algorithm for tensor completion that works in the overcomplete case when r > n; in fact it works all the way up to r = n^{3/2 - ε}. Our proofs are short and simple, and are based on establishing a new connection between noisy tensor completion (through the language of Rademacher complexity) and the task of refuting random constraint satisfaction problems. This connection seems to have gone unnoticed even in the context of matrix completion. Furthermore, we use this connection to show matching lower bounds. Our main technical result characterizes the Rademacher complexity of the sequence of norms that arise in the sum-of-squares relaxations of the tensor nuclear norm. These results point to an interesting new direction: can we explore computational vs. sample complexity tradeoffs through the sum-of-squares hierarchy?
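To make the observation model concrete, here is a minimal sketch (not from the paper) of the setup the abstract describes: a tensor that is exactly rank r (a special case of "entry-wise close to rank r"), with m = n^{3/2} r entries sampled uniformly at random and corrupted by additive noise. All variable names and the noise level are illustrative choices, not the authors'.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 3  # illustrative sizes; here n_1 = n_2 = n_3 = n

# A rank-r tensor: the sum of r outer products a_i (x) b_i (x) c_i.
A, B, C = (rng.standard_normal((n, r)) / np.sqrt(n) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Observe m = n^{3/2} r entries at uniformly random locations
# (with replacement, for simplicity), each corrupted by noise.
m = int(n ** 1.5 * r)
idx = rng.integers(0, n, size=(m, 3))
noise = 0.01 * rng.standard_normal(m)
observations = T[idx[:, 0], idx[:, 1], idx[:, 2]] + noise

print(m, observations.shape)
```

The completion task is then to recover (most of) T from `idx` and `observations` alone; the paper's algorithm does this via a sum-of-squares relaxation of the tensor nuclear norm, which this sketch does not attempt to implement.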


