Analyzing Tensor Power Method Dynamics in Overcomplete Regime

11/06/2014
by Anima Anandkumar, et al.

We present a novel analysis of the dynamics of tensor power iterations in the overcomplete regime, where the tensor CP rank is larger than the input dimension. Finding the CP decomposition of an overcomplete tensor is NP-hard in general. We consider the case where the tensor components are drawn at random, and show that simple power iteration recovers the components with bounded error under mild initialization conditions. We apply our analysis to unsupervised learning of latent variable models, such as multi-view mixture models and spherical Gaussian mixtures. Given the third-order moment tensor, we learn the parameters using tensor power iterations, and we prove that the model parameters are correctly recovered even when the number of hidden components k is much larger than the data dimension d, up to k = o(d^1.5). We initialize the power iterations with data samples and prove that this initialization succeeds under mild conditions on the signal-to-noise ratio of the samples. Our analysis significantly expands the class of latent variable models to which spectral methods apply, and it also handles noise in the input tensor, yielding a sample complexity result for learning latent variable models.
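The object of study is the symmetric tensor power update u &lt;- T(I, u, u) / ||T(I, u, u)||. The following is a minimal numpy sketch of that update on a randomly generated overcomplete model; the dimensions (d = 10, k = 30), noise level, and sample-style initialization are illustrative assumptions, not the paper's exact setup or experiments.

```python
import numpy as np

def tensor_power_iteration(T, u0, n_iter=50):
    """Run the symmetric tensor power update u <- T(I, u, u) / ||T(I, u, u)||."""
    u = u0 / np.linalg.norm(u0)
    for _ in range(n_iter):
        # Contract the third-order tensor T with u along two modes:
        # v[j] = sum_{k,l} T[j, k, l] * u[k] * u[l]
        v = np.einsum('jkl,k,l->j', T, u, u)
        u = v / np.linalg.norm(v)
    return u

# Random overcomplete model: k > d unit-norm random components
# (the paper's guarantee allows up to k = o(d^1.5)).
rng = np.random.default_rng(0)
d, k = 10, 30
A = rng.standard_normal((d, k))
A /= np.linalg.norm(A, axis=0)          # columns a_1, ..., a_k

# Third-order moment-style tensor T = sum_i a_i (x) a_i (x) a_i
T = np.einsum('di,ei,fi->def', A, A, A)

# Initialize near one component plus noise, mimicking a sample-based start
i = 0
u0 = A[:, i] + 0.3 * rng.standard_normal(d)
u_hat = tensor_power_iteration(T, u0)

print('correlation with true component:', abs(u_hat @ A[:, i]))
```

Because the components are not orthogonal in the overcomplete regime, the iteration's fixed points are only approximately aligned with the true components; the paper's analysis bounds this residual error for random components when k = o(d^1.5).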

Related research

08/03/2014
Sample Complexity Analysis for Learning Overcomplete Latent Variable Models through Tensor Methods
We provide guarantees for learning latent variable models emphasizing on...

11/13/2013
Nonparametric Estimation of Multi-View Latent Variable Models
Spectral methods have greatly advanced the estimation of latent variable...

02/27/2018
Learning Binary Latent Variable Models: A Tensor Eigenpair Approach
Latent variable models with hidden binary units appear in various applic...

12/28/2016
Provable learning of Noisy-or Networks
Many machine learning applications use latent variable models to explain...

11/07/2022
Lower Bounds for the Convergence of Tensor Power Iteration on Random Overcomplete Models
Tensor decomposition serves as a powerful primitive in statistics and ma...

12/09/2014
Provable Tensor Methods for Learning Mixtures of Generalized Linear Models
We consider the problem of learning mixtures of generalized linear model...

02/05/2015
Provable Sparse Tensor Decomposition
We propose a novel sparse tensor decomposition method, namely Tensor Tru...
