Provable Sparse Tensor Decomposition

02/05/2015
by Will Wei Sun, et al.

We propose a novel sparse tensor decomposition method, the Tensor Truncated Power (TTP) method, which incorporates variable selection into the estimation of the decomposition components. Sparsity is achieved via an efficient truncation step embedded in the tensor power iteration. The method applies to a broad family of high-dimensional latent variable models, including high-dimensional Gaussian mixtures and mixtures of sparse regressions. We further conduct a thorough theoretical investigation: the final decomposition estimator is guaranteed to achieve a local statistical rate, which we strengthen to a global statistical rate by introducing a proper initialization procedure. In high-dimensional regimes, the obtained statistical rate significantly improves upon those of existing non-sparse decomposition methods. The empirical advantages of TTP are confirmed in extensive simulations and in two real applications: click-through rate prediction and high-dimensional gene clustering.
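The core computational idea described in the abstract, a tensor power iteration with a truncation step that enforces sparsity, can be illustrated with a short sketch. The code below is a minimal illustrative implementation for a symmetric third-order tensor, not the authors' TTP algorithm: the function names, the sparsity level s, the random-restart scheme, and the synthetic example are all assumptions made for demonstration.

```python
import numpy as np

def truncate(v, s):
    """Keep the s largest-magnitude entries of v, zero out the rest, and renormalize."""
    u = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    u[idx] = v[idx]
    norm = np.linalg.norm(u)
    return u / norm if norm > 0 else u

def sparse_tensor_power(T, s, n_iter=50, n_restarts=10, seed=0):
    """Rank-1 sparse decomposition of a symmetric third-order tensor T:
    power iterations with hard truncation at sparsity level s,
    repeated over random restarts (a crude stand-in for a proper initialization)."""
    d = T.shape[0]
    rng = np.random.default_rng(seed)
    best_lam, best_v = 0.0, np.zeros(d)
    for _ in range(n_restarts):
        v = rng.standard_normal(d)
        v /= np.linalg.norm(v)
        for _ in range(n_iter):
            w = np.einsum('ijk,j,k->i', T, v, v)  # power update T(I, v, v)
            v = truncate(w, s)                    # truncation step -> sparse iterate
        lam = np.einsum('ijk,i,j,k->', T, v, v, v)  # estimated component weight
        if abs(lam) > abs(best_lam):
            best_lam, best_v = lam, v
    return best_lam, best_v

# Toy check (illustrative): recover a planted sparse rank-1 component from a noisy tensor.
d, s = 50, 5
v_true = np.zeros(d)
v_true[:s] = 1.0 / np.sqrt(s)
T = 3.0 * np.einsum('i,j,k->ijk', v_true, v_true, v_true)
T += 0.01 * np.random.default_rng(1).standard_normal((d, d, d))
lam_hat, v_hat = sparse_tensor_power(T, s)
print(lam_hat, abs(v_hat @ v_true))  # expected: weight near 3, correlation near 1 in this low-noise setup
```

The random restarts above are only a rough substitute for the initialization procedure the abstract refers to; the paper's point is precisely that a proper initialization is what upgrades the local statistical rate to a global one.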

Related research

12/30/2014  High Dimensional Expectation-Maximization Algorithm: Statistical Optimization and Asymptotic Normality
06/14/2015  Fast and Guaranteed Tensor Decomposition via Sketching
11/06/2014  Analyzing Tensor Power Method Dynamics in Overcomplete Regime
10/29/2012  Tensor decompositions for learning latent variable models
06/19/2015  Doubly Decomposing Nonparametric Tensor Regression
08/03/2014  Sample Complexity Analysis for Learning Overcomplete Latent Variable Models through Tensor Methods
11/07/2022  Lower Bounds for the Convergence of Tensor Power Iteration on Random Overcomplete Models
