Towards Probabilistic Tensor Canonical Polyadic Decomposition 2.0: Automatic Tensor Rank Learning Using Generalized Hyperbolic Prior

09/05/2020
by Lei Cheng, et al.

Tensor rank learning for canonical polyadic decomposition (CPD) has long been deemed an essential yet challenging problem. In particular, since the tensor rank controls the complexity of the CPD model, inaccurate rank learning causes overfitting to noise or underfitting to the signal sources, and can even destroy the interpretability of the model parameters. However, optimal determination of the tensor rank is known to be a non-deterministic polynomial-time hard (NP-hard) task. Rather than exhaustively searching for the best tensor rank via trial-and-error experiments, Bayesian inference under the Gaussian-gamma prior was introduced in the context of probabilistic CPD modeling and was shown to be an effective strategy for automatic tensor rank determination. This triggered flourishing research on other structured tensor CPDs with automatic tensor rank learning. On the other side of the coin, these works also reveal that the Gaussian-gamma model does not perform well for high-rank tensors and/or low signal-to-noise ratios (SNRs). To overcome these drawbacks, in this paper we introduce a more flexible generalized hyperbolic (GH) prior into the probabilistic CPD model, which not only includes the Gaussian-gamma model as a special case but also provides more flexibility to adapt to different levels of sparsity. Based on this novel probabilistic model, an algorithm is developed under the framework of variational inference, where each update is obtained in closed form. Extensive numerical results on synthetic data and real-world datasets demonstrate the excellent performance of the proposed method in learning both low and high tensor ranks, even at low SNRs.
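To make the idea of automatic rank learning concrete, the sketch below builds a noisy rank-3 CP tensor, fits it with an over-estimated rank budget, and prunes unused columns with a simplified ARD-style (Gaussian-gamma-like) column-precision update inside alternating least squares. This is NOT the paper's GH-prior variational algorithm, only a NumPy point-estimate illustration of the "sparsity prior on factor columns reveals the rank" mechanism; all dimensions, thresholds, and the `gamma` update rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: a 20x20x20 tensor of true CP rank 3,
# fitted with an over-estimated rank budget R = 8.
I, J, K, true_rank, R = 20, 20, 20, 3, 8

# Synthesize a noisy rank-3 tensor from the CP model:
# X[i,j,k] = sum_r A[i,r] B[j,r] C[k,r] + noise
A, B, C = (rng.standard_normal((n, true_rank)) for n in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A, B, C)
X = X + 0.01 * rng.standard_normal(X.shape)

def unfold(T, mode):
    """Mode-n unfolding: rows indexed by `mode`, remaining axes kept in order."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (m x R) and V (n x R) -> (m*n x R)."""
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

# Over-parameterized factors plus one shared precision per column (ARD-style).
F = [rng.standard_normal((n, R)) for n in (I, J, K)]
gamma = np.ones(R)

for _ in range(50):
    for mode in range(3):
        U, V = (F[m] for m in range(3) if m != mode)  # ascending mode order
        Z = khatri_rao(U, V)                          # matches unfold() layout
        G = Z.T @ Z + np.diag(gamma)                  # per-column ridge weights
        F[mode] = np.linalg.solve(G, Z.T @ unfold(X, mode).T).T
    # Point-estimate ARD update (an assumption, not the paper's VI update):
    # a column's precision grows as its total energy shrinks, so columns
    # that only fit noise are progressively driven toward zero.
    energy = sum(np.sum(Fm**2, axis=0) for Fm in F)
    gamma = np.minimum((I + J + K) / (energy + 1e-12), 1e8)

# Surviving columns (non-negligible power in all modes) estimate the rank.
col_power = np.prod([np.linalg.norm(Fm, axis=0) for Fm in F], axis=0)
est_rank = int(np.sum(col_power > 1e-3 * col_power.max()))
Xhat = np.einsum('ir,jr,kr->ijk', *F)
rel_err = np.linalg.norm(X - Xhat) / np.linalg.norm(X)
print('estimated rank:', est_rank, 'relative error:', round(rel_err, 4))
```

In this toy setting the column powers separate sharply, so thresholding them typically recovers the true rank without any trial-and-error sweep over candidate ranks, which is the behavior the Gaussian-gamma (and, in the paper, GH) prior automates within a full Bayesian treatment.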

