Bayesian Sparse Tucker Models for Dimension Reduction and Tensor Completion

05/10/2015
by Qibin Zhao, et al.

Tucker decomposition is a cornerstone of modern machine learning on tensorial data, and has attracted considerable attention for multiway feature extraction, compressive sensing, and tensor completion. The most challenging problem is the determination of model complexity (i.e., multilinear rank), especially when noise and missing data are present. In addition, existing methods cannot account for the uncertainty of the latent factors, resulting in low generalization performance. To address these issues, we present a class of probabilistic generative Tucker models for tensor decomposition and completion with structural sparsity over the multilinear latent space. To exploit structural sparse modeling, we introduce two group-sparsity-inducing priors through hierarchical representations of the Laplace and Student-t distributions, which facilitate fully Bayesian posterior inference. For model learning, we derive variational Bayesian inference over all model (hyper)parameters and develop efficient, scalable algorithms based on multilinear operations. Our methods can automatically adapt model complexity and infer an optimal multilinear rank by maximizing the lower bound of the model evidence. Experimental results and comparisons on synthetic, chemometrics, and neuroimaging data demonstrate the remarkable performance of our models in recovering the ground-truth multilinear rank and missing entries.
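To make the Tucker model concrete, here is a minimal numpy sketch of the multilinear operations the abstract refers to: a full tensor is reconstructed from a small core tensor multiplied by a factor matrix along each mode, and the core's dimensions are the multilinear rank the paper's priors aim to infer. This is an illustrative sketch only, not the paper's algorithm; the function names (`mode_n_product`, `tucker_reconstruct`) and the toy sizes are assumptions for the example.

```python
import numpy as np

def mode_n_product(tensor, matrix, mode):
    # Contract the tensor's `mode` axis with the matrix's second axis,
    # then move the new axis (the matrix's rows) back into position `mode`.
    return np.moveaxis(np.tensordot(tensor, matrix, axes=(mode, 1)), -1, mode)

def tucker_reconstruct(core, factors):
    # Rebuild the full tensor: core x_1 U1 x_2 U2 x_3 U3 ...
    out = core
    for mode, U in enumerate(factors):
        out = mode_n_product(out, U, mode)
    return out

# Toy example: a multilinear-rank-(2, 3, 2) Tucker model of a 4 x 5 x 6 tensor.
rng = np.random.default_rng(0)
shape, ranks = (4, 5, 6), (2, 3, 2)
core = rng.standard_normal(ranks)
factors = [rng.standard_normal((d, r)) for d, r in zip(shape, ranks)]
X = tucker_reconstruct(core, factors)
print(X.shape)  # (4, 5, 6)
```

Group-sparsity priors such as those in the paper act on whole columns of the factor matrices: when a column (and the matching core slice) is driven to zero, the effective rank along that mode shrinks, which is how the model complexity adapts automatically.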

01/25/2014

Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination

CANDECOMP/PARAFAC (CP) tensor factorization of incomplete data is a powe...
10/09/2014

Bayesian Robust Tensor Factorization for Incomplete Multiway Data

We propose a generative model for robust tensor factorization in the pre...
09/07/2018

Tensor Ring Decomposition with Rank Minimization on Latent Space: An Efficient Approach for Tensor Completion

In tensor completion tasks, the traditional low-rank tensor decompositio...
01/31/2013

Rank regularization and Bayesian inference for tensor completion and extrapolation

A novel regularizer of the PARAFAC decomposition factors capturing the t...
05/27/2019

Tuning Free Rank-Sparse Bayesian Matrix and Tensor Completion with Global-Local Priors

Matrix and tensor completion are frameworks for a wide range of problems...
09/05/2020

Towards Probabilistic Tensor Canonical Polyadic Decomposition 2.0: Automatic Tensor Rank Learning Using Generalized Hyperbolic Prior

Tensor rank learning for canonical polyadic decomposition (CPD) has long...
05/28/2022

Rethinking Bayesian Learning for Data Analysis: The Art of Prior and Inference in Sparsity-Aware Modeling

Sparse modeling for signal processing and machine learning has been at t...
