Tensor Ring Decomposition with Rank Minimization on Latent Space: An Efficient Approach for Tensor Completion

09/07/2018
by   Longhao Yuan, et al.
In tensor completion tasks, traditional low-rank tensor decomposition models suffer from a laborious model selection problem due to their high model sensitivity. For tensor ring (TR) decomposition in particular, the number of model possibilities grows exponentially with the tensor order, which makes finding the optimal TR decomposition rather challenging. In this paper, by exploiting the low-rank structure of the TR latent space, we propose a novel tensor completion method that is robust to model selection. Instead of imposing a low-rank constraint on the data space, we introduce nuclear norm regularization on the latent TR factors, so that the optimization step using singular value decomposition (SVD) can be performed at a much smaller scale. By leveraging the alternating direction method of multipliers (ADMM) scheme, the latent TR factors with optimal rank and the recovered tensor are obtained simultaneously. Our algorithm effectively alleviates the burden of TR-rank selection and thereby greatly reduces the computational cost. Extensive experiments on synthetic and real-world data demonstrate the superior performance and efficiency of the proposed approach over state-of-the-art algorithms.
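The core building block behind nuclear norm regularization in ADMM-type schemes is singular value thresholding (SVT), the proximal operator of the nuclear norm. The sketch below is illustrative only, not the authors' implementation: it shows why applying SVT to a small latent factor matrix (as in the proposed latent-space approach) is cheap compared with applying it to a full data unfolding. The function name `svt` and the toy rank-1 data are assumptions introduced here for demonstration.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of
    tau * ||.||_* (nuclear norm). Shrinks each singular value
    of M toward zero by tau, discarding those below tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

# Toy example: a rank-1 factor matrix corrupted by small noise.
# SVT with a threshold above the noise level recovers a
# low-rank estimate; on a small latent factor this SVD costs
# far less than an SVD of a full tensor unfolding.
rng = np.random.default_rng(0)
u = rng.standard_normal((20, 1))
v = rng.standard_normal((1, 30))
M = u @ v + 0.01 * rng.standard_normal((20, 30))
X = svt(M, tau=0.5)
print(np.linalg.matrix_rank(X, tol=1e-6))  # low rank after thresholding
```

In the latent-space formulation, an update of this form is applied to each (small) TR factor inside the ADMM loop rather than to the full data tensor, which is where the claimed reduction in SVD cost comes from.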
