Smoothed Analysis of Tensor Decompositions

11/14/2013
by Aditya Bhaskara, et al.

Low-rank tensor decompositions are a powerful tool for learning generative models, and uniqueness results give them a significant advantage over matrix decomposition methods. However, tensors pose significant algorithmic challenges, and tensor analogs of much of the matrix algebra toolkit are unlikely to exist because of hardness results. Efficient decomposition in the overcomplete case (where rank exceeds dimension) is particularly challenging. We introduce a smoothed analysis model for studying these questions and develop an efficient algorithm for tensor decomposition in the highly overcomplete case (rank polynomial in the dimension). In this setting, we show that our algorithm is robust to inverse polynomial error -- a crucial property for applications in learning, since we are only allowed a polynomial number of samples. While algorithms are known for exact tensor decomposition in some overcomplete settings, our main contribution is in analyzing their stability in the framework of smoothed analysis. Our main technical contribution is to show that tensor products of perturbed vectors are linearly independent in a robust sense (i.e., the associated matrix has singular values that are at least an inverse polynomial). This key result paves the way for applying tensor methods to learning problems in the smoothed setting. In particular, we use it to obtain results for learning multi-view models and mixtures of axis-aligned Gaussians where there are many more "components" than dimensions. The assumption here is that the model is not adversarially chosen, formalized by a perturbation of model parameters. We believe this is an appealing way to analyze realistic instances of learning problems, since this framework allows us to overcome many of the usual limitations of using tensor methods.
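The key technical claim — that tensor products of randomly perturbed vectors are robustly linearly independent — can be illustrated numerically. The following is a minimal NumPy sketch, not the paper's algorithm: it perturbs an arbitrary set of unit vectors, flattens their order-2 tensor products into columns of a matrix (a Khatri-Rao product), and checks that the smallest singular value is bounded away from zero even in the overcomplete regime where the number of vectors exceeds the dimension. The dimension, rank, and perturbation size below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, R, rho = 8, 20, 0.01  # dimension, rank (overcomplete: R > n), perturbation size

# Arbitrary (possibly adversarially chosen) unit vectors in R^n.
A = rng.standard_normal((n, R))
A /= np.linalg.norm(A, axis=0)

# Smoothed-analysis step: add a small random perturbation to each vector.
A_pert = A + rho * rng.standard_normal((n, R))

# Columns are the flattened tensor products a_i (x) a_i, so M is n^2 x R.
M = np.stack([np.outer(A_pert[:, i], A_pert[:, i]).ravel() for i in range(R)],
             axis=1)

# Robust linear independence: the smallest singular value should be
# noticeably positive, not just nonzero.
sigma_min = np.linalg.svd(M, compute_uv=False)[-1]
print(f"smallest singular value of the {n*n}x{R} product matrix: {sigma_min:.6f}")
```

Note that R = 20 columns in an n² = 64-dimensional space would still be linearly dependent if the vectors were adversarial (e.g., repeated); the perturbation is what guarantees the inverse-polynomial lower bound on the singular values.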


