Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery

07/22/2013
by Cun Mu, et al.

Recovering a low-rank tensor from incomplete information is a recurring problem in signal processing and machine learning. The most popular convex relaxation of this problem minimizes the sum of the nuclear norms of the unfoldings of the tensor. We show that this approach can be substantially suboptimal: reliably recovering a K-way tensor of length n and Tucker rank r from Gaussian measurements requires Ω(r n^(K−1)) observations. In contrast, a certain (intractable) nonconvex formulation needs only O(r^K + nrK) observations. We introduce a simple new convex relaxation which partially bridges this gap: it succeeds with O(r^⌊K/2⌋ n^⌈K/2⌉) observations. While these results pertain to Gaussian measurements, simulations strongly suggest that the new norm also outperforms the sum of nuclear norms for tensor completion from a random subset of entries. Our lower bound for the sum-of-nuclear-norms model follows from a new result on recovering signals with multiple sparse structures (e.g., simultaneously sparse and low-rank), which, perhaps surprisingly, demonstrates the significant suboptimality of the common recovery approach of minimizing the sum of individual sparsity-inducing norms (e.g., the ℓ_1 norm and the nuclear norm). Our new formulation for low-rank tensor recovery, however, opens the possibility of reducing the sample complexity by exploiting several structures jointly.
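The gap described above comes from how the tensor is flattened before applying the nuclear norm. A minimal NumPy sketch (the helper `random_tucker_tensor` and all parameter values are illustrative choices, not from the paper) shows the reshaping at the heart of the improved relaxation: the standard mode-1 unfolding of a K-way tensor is a lopsided n × n^(K−1) matrix of rank at most r, whereas grouping half the modes on each side yields a roughly square matrix whose rank is still small (at most r^(K/2)) relative to its dimensions:

```python
import numpy as np

def random_tucker_tensor(n, r, K, seed=0):
    """Random K-way tensor of side length n with Tucker rank <= r in every mode."""
    rng = np.random.default_rng(seed)
    T = rng.standard_normal((r,) * K)  # core tensor
    for _ in range(K):
        U = rng.standard_normal((n, r))
        # Contract the current leading core mode with a factor matrix;
        # the new length-n mode is appended, so after K steps every
        # core mode has been multiplied out and T has shape (n,)*K.
        T = np.tensordot(T, U, axes=([0], [1]))
    return T

n, r, K = 6, 2, 4
T = random_tucker_tensor(n, r, K)

# Standard mode-1 unfolding: rows index mode 1, columns the remaining modes.
unfold = T.reshape(n, n ** (K - 1))          # 6 x 216, rank <= r = 2
# "Square" reshaping: modes {1,2} vs. modes {3,4}.
square = T.reshape(n ** (K // 2), n ** (K // 2))  # 36 x 36, rank <= r^2 = 4

print(unfold.shape, np.linalg.matrix_rank(unfold))
print(square.shape, np.linalg.matrix_rank(square))
```

The square matrix is far better conditioned for nuclear-norm minimization: its rank-to-dimension ratio (at most r^2 / n^2 here) matches the regime where matrix recovery guarantees are strongest, which is the intuition behind the O(r^⌊K/2⌋ n^⌈K/2⌉) sample complexity.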

Related research

- Scaled Nuclear Norm Minimization for Low-Rank Tensor Completion (07/25/2017): Minimizing the nuclear norm of a matrix has been shown to be very effici...
- Simultaneous structures in convex signal recovery - revisiting the convex combination of norms (04/16/2019): In compressed sensing one uses known structures of otherwise unknown sig...
- Recovery from Power Sums (06/26/2021): We study the problem of recovering a collection of n numbers from the ev...
- Noisy Tensor Completion via the Sum-of-Squares Hierarchy (01/26/2015): In the noisy tensor completion problem we observe m entries (whose locat...
- Efficient Proximal Mapping Computation for Unitarily Invariant Low-Rank Inducing Norms (10/17/2018): Low-rank inducing unitarily invariant norms have been introduced to conv...
- Convex Tensor Decomposition via Structured Schatten Norm Regularization (03/26/2013): We discuss structured Schatten norms for tensor decomposition that inclu...
- Hankel Matrix Nuclear Norm Regularized Tensor Completion for N-dimensional Exponential Signals (04/06/2016): Signals are generally modeled as a superposition of exponential function...
