
High-order Tensor Completion for Data Recovery via Sparse Tensor-train Optimization
In this paper, we aim at the problem of tensor data completion. Tensor-train decomposition is adopted because of its powerful representation ability and its linear scalability with tensor order. We propose an algorithm named STTO (Sparse Tensor-train Optimization), which treats the incomplete data as a sparse tensor and uses a first-order optimization method to find the factors of the tensor-train decomposition. Our algorithm is shown to perform well in simulation experiments in both low-order and high-order cases. We also employ a tensorization method that transforms data to a higher order to enhance the performance of our algorithm. The image recovery results in various cases show that our method outperforms other completion algorithms. Especially when the missing rate is very high, e.g., 90%, our method shows significantly better performance than other state-of-the-art methods.
11/07/2017 ∙ by Longhao Yuan, et al.
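The listing gives no code, but the sparse, entry-wise first-order scheme the abstract describes can be sketched roughly as follows. This is our own illustrative implementation, not the paper's (all function and variable names are ours): observed data is kept as a list of (index, value) pairs, and one gradient step is taken on the squared error summed over those entries only.

```python
import numpy as np

def tt_entry(cores, idx):
    """Evaluate a tensor-train model at one multi-index.
    cores[k] has shape (r_k, n_k, r_{k+1}) with r_0 = r_N = 1."""
    v = cores[0][:, idx[0], :]              # (1, r_1)
    for k in range(1, len(cores)):
        v = v @ cores[k][:, idx[k], :]      # (1, r_{k+1})
    return v[0, 0]

def stto_step(cores, entries, lr=0.01):
    """One gradient-descent step on 0.5 * sum (model(idx) - y)^2,
    summed only over the observed (idx, y) pairs (sparse data).
    Returns the loss at the current point, then updates the cores in place."""
    grads = [np.zeros_like(c) for c in cores]
    loss = 0.0
    for idx, y in entries:
        # prefix products: lefts[k] = slice_0 @ ... @ slice_{k-1}, shape (1, r_k)
        lefts = [np.ones((1, 1))]
        for k in range(len(cores)):
            lefts.append(lefts[-1] @ cores[k][:, idx[k], :])
        # suffix products: rights[k] = slice_k @ ... @ slice_{N-1}, shape (r_k, 1)
        rights = [np.ones((1, 1))]
        for k in reversed(range(len(cores))):
            rights.append(cores[k][:, idx[k], :] @ rights[-1])
        rights = rights[::-1]
        r = lefts[-1][0, 0] - y             # residual at this observed entry
        loss += 0.5 * r * r
        for k in range(len(cores)):
            # d(entry)/d(core slice) is the outer product of prefix and suffix
            grads[k][:, idx[k], :] += r * (lefts[k].T @ rights[k + 1].T)
    for c, g in zip(cores, grads):
        c -= lr * g
    return loss
```

The papers use a more sophisticated first-order solver; this sketch uses plain gradient descent with a fixed step size, which already captures why the cost scales with the number of observed entries rather than the full tensor size.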

Completion of High-order Tensor Data with Missing Entries via Tensor-train Decomposition
In this paper, we aim at the completion problem of high-order tensor data with missing entries. The existing tensor factorization and completion methods suffer from the curse of dimensionality when the tensor order N>>3. To overcome this problem, we propose an efficient algorithm called TT-WOPT (Tensor-train Weighted OPTimization) to find the latent core tensors of the tensor data and recover the missing entries. Tensor-train decomposition, which has powerful representation ability and linear scalability with tensor order, is employed in our algorithm. The experimental results on synthetic data and natural image completion demonstrate that our method significantly outperforms the other related methods, especially when the missing rate of the data is very high, e.g., 85%, where it performs much better than other state-of-the-art algorithms.
09/08/2017 ∙ by Longhao Yuan, et al.
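The weighted-optimization idea can be sketched as minimizing 0.5·||W ∘ (X − Y)||², where W is a binary weight tensor that is 1 on observed entries and 0 elsewhere. The code below is our own hedged illustration of that objective and its exact gradient with respect to each TT core (names are ours, not the paper's); the gradient contracts the residual with the prefix and suffix chains around each core.

```python
import numpy as np

def tt_to_full(cores):
    """Contract TT cores of shape (r_k, n_k, r_{k+1}), r_0 = r_N = 1,
    into the full tensor."""
    res = cores[0]
    for c in cores[1:]:
        res = np.tensordot(res, c, axes=([-1], [0]))
    return res.reshape(res.shape[1:-1])

def ttwopt_loss_grads(cores, Y, W):
    """Weighted loss 0.5 * ||W * (X - Y)||_F^2 and its gradient
    w.r.t. every core (W is 1 on observed entries, 0 on missing ones)."""
    R = W * (tt_to_full(cores) - Y)         # residual, zero on missing entries
    loss = 0.5 * np.sum(R * R)
    grads = []
    for k in range(len(cores)):
        # left chain: matrix (prod of modes 0..k-1, r_k)
        left = np.ones((1, 1))
        for c in cores[:k]:
            left = np.tensordot(left, c, axes=([-1], [0])).reshape(-1, c.shape[-1])
        # right chain: matrix (r_{k+1}, prod of modes k+1..N-1)
        right = np.ones((1, 1))
        for c in reversed(cores[k + 1:]):
            right = np.tensordot(c, right, axes=([-1], [0])).reshape(c.shape[0], -1)
        Rm = R.reshape(left.shape[0], cores[k].shape[1], right.shape[1])
        grads.append(np.einsum('pa,piq,bq->aib', left, Rm, right))
    return loss, grads
```

In practice the returned gradients would be fed to a first-order optimizer; the zero weights guarantee that missing entries contribute nothing to either loss or gradient.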

Tensor Ring Decomposition with Rank Minimization on Latent Space: An Efficient Approach for Tensor Completion
In tensor completion tasks, traditional low-rank tensor decomposition models suffer from a laborious model-selection problem due to high model sensitivity. Especially for tensor ring (TR) decomposition, the number of model possibilities grows exponentially with the tensor order, which makes it rather challenging to find the optimal TR decomposition. In this paper, by exploiting the low-rank structure of the TR latent space, we propose a novel tensor completion method which is robust to model selection. In contrast to imposing a low-rank constraint on the data space, we introduce nuclear norm regularization on the latent TR factors, so that the optimization step using singular value decomposition (SVD) can be performed at a much smaller scale. By leveraging the alternating direction method of multipliers (ADMM) scheme, the latent TR factors with optimal rank and the recovered tensor can be obtained simultaneously. Our proposed algorithm effectively alleviates the burden of TR-rank selection, and the computational cost is therefore greatly reduced. Extensive experimental results on synthetic and real-world data demonstrate the superior performance and efficiency of the proposed approach against state-of-the-art algorithms.
09/07/2018 ∙ by Longhao Yuan, et al.
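The key primitive behind nuclear-norm regularization solved by SVD at the factor scale is singular value thresholding (SVT), the proximal operator of the nuclear norm. As a hedged sketch (our own names and unfolding choice, not the paper's code), applying SVT to the mode-2 unfolding of one TR core:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox of tau * (nuclear norm) at M."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def prox_tr_core(core, tau):
    """Apply SVT to the mode-2 unfolding (n, r*r') of one TR core (r, n, r').
    The SVD runs on an n x (r*r') matrix, i.e. at factor scale,
    instead of on a huge unfolding of the full data tensor."""
    r, n, r2 = core.shape
    M = core.transpose(1, 0, 2).reshape(n, r * r2)
    return svt(M, tau).reshape(n, r, r2).transpose(1, 0, 2)
```

Inside an ADMM loop this prox step alternates with a data-fitting update and a dual update; the point the abstract makes is that the expensive SVD now scales with the small factor, not the data tensor.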

Higher-dimension Tensor Completion via Low-rank Tensor Ring Decomposition
The problem of incomplete data is common in signal processing and machine learning. Tensor completion aims to recover incomplete tensor data from its partially observed entries. In this paper, based on the recently proposed tensor ring (TR) decomposition, we propose a new tensor completion approach named tensor ring weighted optimization (TR-WOPT). It finds the latent factors of the incomplete tensor by a gradient descent algorithm; the latent factors are then employed to predict the missing entries of the tensor. We conduct tensor completion experiments on synthetic data and real-world data. The simulation results show that TR-WOPT performs well and is robust to high-dimensional tensors. Image completion results show that the proposed algorithm outperforms state-of-the-art algorithms in many situations; in particular, better performance is obtained from TR-WOPT when the image is tensorized to a higher dimension.
07/03/2018 ∙ by Longhao Yuan, et al.
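The gradient-descent idea for a tensor-ring model differs from the tensor-train case in that an entry is the trace of a closed chain of core slices. A minimal sketch under our own naming (not the paper's code):

```python
import numpy as np

def tr_entry(cores, idx):
    """Tensor-ring entry: trace of the product of core slices.
    cores[k] has shape (r_k, n_k, r_{k+1}) with r_N = r_0 (the ring closes)."""
    P = cores[0][:, idx[0], :]
    for k in range(1, len(cores)):
        P = P @ cores[k][:, idx[k], :]
    return np.trace(P)

def trwopt_step(cores, entries, lr=0.005):
    """One gradient step on 0.5 * sum (tr_entry(idx) - y)^2 over the
    observed (idx, y) pairs. Returns the current loss, updates cores in place."""
    r0 = cores[0].shape[0]
    grads = [np.zeros_like(c) for c in cores]
    loss = 0.0
    for idx, y in entries:
        lefts = [np.eye(r0)]                 # lefts[k] = slice_0 @ ... @ slice_{k-1}
        for k in range(len(cores)):
            lefts.append(lefts[-1] @ cores[k][:, idx[k], :])
        rights = [np.eye(r0)]                # rights[k] = slice_k @ ... @ slice_{N-1}
        for k in reversed(range(len(cores))):
            rights.append(cores[k][:, idx[k], :] @ rights[-1])
        rights = rights[::-1]
        res = np.trace(lefts[-1]) - y        # residual at this entry
        loss += 0.5 * res * res
        for k in range(len(cores)):
            # d trace(L @ G @ R) / dG = (R @ L).T
            grads[k][:, idx[k], :] += res * (rights[k + 1] @ lefts[k]).T
    for c, g in zip(cores, grads):
        c -= lr * g
    return loss
```

Because the chain closes into a ring, all cores play a symmetric role, which is one reason TR models are reported to be more robust when data is tensorized to a higher dimension.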

Rank Minimization on Tensor Ring: A New Paradigm in Scalable Tensor Decomposition and Completion
In low-rank tensor completion tasks, traditional methods suffer from high computational cost and high sensitivity to model complexity, due to the multiple underlying large-scale singular value decomposition (SVD) operations and the rank-selection problem. In this paper, taking advantage of the high compressibility of the recently proposed tensor ring (TR) decomposition, we propose a new model for the tensor completion problem. This is achieved by introducing convex surrogates of the tensor low-rank assumption on the latent tensor ring factors, which makes it possible for models based on Schatten norm regularization to be solved at a much smaller scale. We propose two algorithms which apply different structured Schatten norms to the tensor ring factors. With the alternating direction method of multipliers (ADMM) scheme, the tensor ring factors and the predicted tensor can be optimized simultaneously. Experiments on synthetic data and real-world data show the high performance and efficiency of the proposed approach.
05/22/2018 ∙ by Longhao Yuan, et al.
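A structured Schatten norm on the latent factors can be read as a sum of nuclear norms of chosen core unfoldings, and the small size of those unfoldings is what makes the SVDs cheap. A hedged sketch of that quantity (our own naming and unfolding convention; the paper's two algorithms differ precisely in which structured norm they impose):

```python
import numpy as np

def core_nuclear(core, mode):
    """Nuclear norm of one unfolding of a TR core of shape (r, n, r')."""
    M = np.moveaxis(core, mode, 0).reshape(core.shape[mode], -1)
    return np.linalg.svd(M, compute_uv=False).sum()

def latent_schatten(cores, modes=(1,)):
    """Sum of nuclear norms of the chosen unfoldings of every TR core:
    a structured Schatten-1 surrogate imposed on the latent factors
    rather than on the full data tensor."""
    return sum(core_nuclear(c, m) for c in cores for m in modes)
```

Each SVD here is on an n × (r·r') matrix, so the regularizer's cost is governed by the TR-ranks, not by the product of all tensor dimensions.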

Randomized Tensor Ring Decomposition and Its Application to Largescale Data Reconstruction
Dimensionality reduction is an essential technique for multiway large-scale data, i.e., tensors. Tensor ring (TR) decomposition has become popular due to its high representation ability and flexibility. However, traditional TR decomposition algorithms suffer from high computational cost when facing large-scale data. In this paper, taking advantage of the recently proposed tensor random projection method, we propose two TR decomposition algorithms. By employing random projection on every mode of the large-scale tensor, the TR decomposition can be processed at a much smaller scale. Simulation experiments show that the proposed algorithms are 4-25 times faster than traditional algorithms without loss of accuracy, and our algorithms show superior performance in deep learning dataset compression and hyperspectral image reconstruction experiments compared to other randomized algorithms.
01/07/2019 ∙ by Longhao Yuan, et al.
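The mode-wise random projection step can be sketched simply: multiply a scaled Gaussian matrix along every mode to shrink the tensor, then run the decomposition on the small sketch. A hedged illustration with our own names (the scaling and sketch sizes below are illustrative assumptions):

```python
import numpy as np

def mode_product(X, M, k):
    """Multiply matrix M of shape (m, n_k) along mode k of tensor X,
    replacing that mode's size n_k by m."""
    Xk = np.moveaxis(X, k, 0)                       # bring mode k to the front
    Y = np.tensordot(M, Xk, axes=([1], [0]))        # contract over mode k
    return np.moveaxis(Y, 0, k)                     # restore the mode order

def tensor_random_projection(X, sketch_sizes, rng):
    """Project every mode of X down to the given sketch sizes using
    scaled Gaussian matrices; a TR decomposition can then be computed
    on the much smaller result."""
    for k, m in enumerate(sketch_sizes):
        Omega = rng.standard_normal((m, X.shape[k])) / np.sqrt(X.shape[k])
        X = mode_product(X, Omega, k)
    return X
```

After sketching, the decomposition cost depends on the sketch sizes rather than the original dimensions, which is where the reported speed-up comes from.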