Dictionary Learning and Tensor Decomposition via the Sum-of-Squares Method

07/06/2014
by Boaz Barak, et al.

We give a new approach to the dictionary learning (also known as "sparse coding") problem of recovering an unknown n × m matrix A (for m ≥ n) from examples of the form y = Ax + e, where x is a random vector in R^m with at most τm nonzero coordinates, and e is a random noise vector in R^n with bounded magnitude. For the case m = O(n), our algorithm recovers every column of A within arbitrarily good constant accuracy in time m^(O(log m / log(1/τ))), in particular achieving polynomial time if τ = m^(-δ) for any δ > 0, and quasipolynomial time m^(O(log m)) if τ is a (sufficiently small) constant. Prior algorithms with comparable assumptions on the distribution required the vector x to be much sparser, with at most √n nonzero coordinates, and there were intrinsic barriers preventing these algorithms from applying to denser x.

We achieve this by designing an algorithm for noisy tensor decomposition that can recover, under quite general conditions, an approximate rank-one decomposition of a tensor T, given access to a tensor T' that is τ-close to T in the spectral norm (when considered as a matrix). To our knowledge, this is the first algorithm for tensor decomposition that works in the constant spectral-norm noise regime, where there is no guarantee that the local optima of T and T' have similar structures. Our algorithm is based on a novel approach to using and analyzing the Sum-of-Squares semidefinite programming hierarchy (Parrilo 2000, Lasserre 2001), and it can be viewed as an indication of the utility of this very general and powerful tool for unsupervised learning problems.
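To make the generative model concrete, here is a minimal sketch (not the paper's SoS algorithm) of drawing examples y = Ax + e: the dictionary A, the sparsity pattern of x, and the noise scale are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model dimensions: A is n x m with m >= n; here m = O(n).
n, m = 50, 100
tau = 0.05  # sparsity parameter: x has at most tau*m nonzero coordinates

# Hypothetical ground-truth dictionary with unit-norm columns.
A = rng.standard_normal((n, m))
A /= np.linalg.norm(A, axis=0)

def sample(num_nonzero=int(tau * m), noise_scale=0.01):
    """Draw one example y = A x + e from the generative model."""
    x = np.zeros(m)
    support = rng.choice(m, size=num_nonzero, replace=False)
    x[support] = rng.choice([-1.0, 1.0], size=num_nonzero)  # random signs
    e = noise_scale * rng.standard_normal(n)  # bounded-magnitude noise
    return A @ x + e, x

y, x = sample()
```

The dictionary learning task is then to approximately recover the columns of A from many such samples y, without seeing x or e.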



Related research

- 06/27/2017 · Fast and robust tensor decomposition with applications to dictionary learning: We develop fast spectral algorithms for tensor decomposition that match ...
- 12/08/2015 · Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors: We consider two problems that arise in machine learning applications: th...
- 01/26/2015 · Noisy Tensor Completion via the Sum-of-Squares Hierarchy: In the noisy tensor completion problem we observe m entries (whose locat...
- 02/14/2022 · Fast algorithm for overcomplete order-3 tensor decomposition: We develop the first fast spectral algorithm to decompose a random third...
- 03/05/2022 · A Robust Spectral Algorithm for Overcomplete Tensor Decomposition: We give a spectral algorithm for decomposing overcomplete order-4 tensor...
- 06/10/2015 · Convolutional Dictionary Learning through Tensor Factorization: Tensor methods have emerged as a powerful paradigm for consistent learni...
- 05/31/2021 · Optimal Spectral Recovery of a Planted Vector in a Subspace: Recovering a planted vector v in an n-dimensional random subspace of ℝ^N...
