Jianting Cao

  • High-order Tensor Completion for Data Recovery via Sparse Tensor-train Optimization

In this paper, we address the problem of tensor data completion. Tensor-train decomposition is adopted because of its powerful representation ability and its linear scalability to tensor order. We propose an algorithm named STTO (Sparse Tensor-train Optimization), which treats the incomplete data as a sparse tensor and uses a first-order optimization method to find the factors of the tensor-train decomposition. Our algorithm is shown to perform well in simulation experiments in both low-order and high-order cases. We also employ a tensorization method that transforms the data to a higher order to enhance the performance of our algorithm. The image recovery results in various cases show that our method outperforms other completion algorithms, especially when the missing rate is very high, e.g., 90%, where it shows better performance than other state-of-the-art methods.

11/07/2017 ∙ by Longhao Yuan, et al.
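The tensor-train model used above represents an order-N tensor by N third-order cores G_k of shape (r_{k-1}, n_k, r_k) with boundary ranks r_0 = r_N = 1. A minimal NumPy sketch of reconstructing the full tensor from such cores (the function name and layout are illustrative, not taken from the paper's code):

```python
import numpy as np

def tt_to_full(cores):
    """Contract TT cores G_k of shape (r_{k-1}, n_k, r_k), with
    r_0 = r_N = 1, into the full order-N tensor."""
    full = cores[0]                                  # (1, n_1, r_1)
    for G in cores[1:]:
        # contract the trailing rank index with the next core's leading one
        full = np.tensordot(full, G, axes=([-1], [0]))
    return np.squeeze(full, axis=(0, -1))            # drop the boundary ranks
```

STTO never forms this full tensor during training: it evaluates the model only at the observed entries, stored as a sparse tensor, and updates the cores with a first-order method such as gradient descent.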

  • High-dimension Tensor Completion via Gradient-based Optimization Under Tensor-train Format

In this paper, we propose a novel approach to recover the missing entries of incomplete data represented by a high-dimensional tensor. Tensor-train decomposition, which has powerful tensor representation ability and is free from "the curse of dimensionality", is employed in our approach. From the observed entries of the incomplete data, we seek factors that capture the latent features of the data and then reconstruct the missing entries. Under a low-rank assumption on the original data, the tensor completion problem is cast as an optimization model. Gradient descent methods are applied to optimize the core tensors of the tensor-train decomposition. We propose two algorithms, Tensor-train Weighted Optimization (TT-WOPT) and Tensor-train Stochastic Gradient Descent (TT-SGD), to solve tensor completion problems. A high-order tensorization method named visual data tensorization (VDT) is proposed to transform visual data into higher-order forms, which improves the performance of our algorithms. The synthetic-data and visual-data experiments show that our algorithms outperform state-of-the-art completion algorithms, especially in high-dimensional, high-missing-rate, and large-scale data cases.

04/05/2018 ∙ by Longhao Yuan, Qibin Zhao, et al.
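TT-SGD samples one observed entry at a time: the model's value at index (i_1, ..., i_N) is the product of matrix slices G_k[:, i_k, :], so the gradient with respect to each slice is an outer product of the left and right partial products. A hedged NumPy sketch of one such update (illustrative, not the authors' implementation):

```python
import numpy as np

def tt_sgd_step(cores, idx, value, lr=1e-3):
    """One stochastic gradient step on TT cores from a single observed
    entry. cores[k] has shape (r_{k-1}, n_k, r_k) with boundary ranks 1."""
    slices = [G[:, i, :] for G, i in zip(cores, idx)]
    N = len(slices)
    lefts = [np.ones((1, 1))]             # lefts[k] = slices[0] @ ... @ slices[k-1]
    for s in slices[:-1]:
        lefts.append(lefts[-1] @ s)
    rights = [np.ones((1, 1))]
    for s in reversed(slices[1:]):
        rights.append(s @ rights[-1])
    rights.reverse()                      # rights[k] = slices[k+1] @ ... @ slices[N-1]
    resid = (lefts[-1] @ slices[-1])[0, 0] - value
    for k in range(N):
        grad = lefts[k].T @ rights[k].T   # (r_{k-1}, r_k): gradient of the entry
        cores[k][:, idx[k], :] -= lr * resid * grad
    return resid
```

Because each step touches only one slice per core, the cost per update is independent of the tensor size, which is what makes the stochastic variant attractive for large-scale data.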

  • Completion of High Order Tensor Data with Missing Entries via Tensor-train Decomposition

In this paper, we address the completion problem of high-order tensor data with missing entries. Existing tensor factorization and completion methods suffer from the curse of dimensionality when the tensor order N >> 3. To overcome this problem, we propose an efficient algorithm called TT-WOPT (Tensor-train Weighted OPTimization) to find the latent core tensors of the data and recover the missing entries. Tensor-train decomposition, which has powerful representation ability with linear scalability to tensor order, is employed in our algorithm. The experimental results on synthetic data and natural image completion demonstrate that our method significantly outperforms the other related methods, especially when the missing rate of the data is very high, e.g., 85%, where it still recovers the data better than other state-of-the-art algorithms.

09/08/2017 ∙ by Longhao Yuan, et al.
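TT-WOPT minimizes the weighted squared error 0.5 * ||W ⊛ (X(G) − T)||², where W is the binary observation mask and X(G) is the tensor-train reconstruction; the gradient with respect to one core is a contraction of the weighted residual with the left and right interface matrices built from the remaining cores. One plausible NumPy sketch of this (an illustration of the idea, not the authors' implementation):

```python
import numpy as np

def tt_full(cores):
    """Contract TT cores of shape (r_{k-1}, n_k, r_k) into the full tensor."""
    full = cores[0]
    for G in cores[1:]:
        full = np.tensordot(full, G, axes=([-1], [0]))
    return np.squeeze(full, axis=(0, -1))

def core_grad(cores, k, R):
    """Gradient of 0.5*||W*(tt_full(cores)-T)||^2 w.r.t. core k, where
    R = W*(tt_full(cores)-T) and W is a binary mask."""
    left = np.ones((1, 1))
    for G in cores[:k]:
        left = np.tensordot(left, G, axes=([-1], [0]))
    L = left.reshape(-1, left.shape[-1])        # (n_1*...*n_{k-1}, r_{k-1})
    if k + 1 < len(cores):
        right = cores[k + 1]
        for G in cores[k + 2:]:
            right = np.tensordot(right, G, axes=([-1], [0]))
        Rt = right.reshape(right.shape[0], -1)  # (r_k, n_{k+1}*...*n_N)
    else:
        Rt = np.ones((1, 1))
    R3 = R.reshape(L.shape[0], cores[k].shape[1], Rt.shape[1])
    return np.einsum('pa,piq,bq->aib', L, R3, Rt)

def tt_wopt(T, W, cores, lr=1e-2, iters=50):
    """Plain gradient descent on all cores (fixed small step for clarity)."""
    for _ in range(iters):
        R = W * (tt_full(cores) - T)
        grads = [core_grad(cores, k, R) for k in range(len(cores))]
        for G, g in zip(cores, grads):
            G -= lr * g
    return cores
```

A fixed step size is used here only to keep the sketch short; in practice a line search or an adaptive first-order optimizer would be the natural choice.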

  • Tensor Ring Decomposition with Rank Minimization on Latent Space: An Efficient Approach for Tensor Completion

In tensor completion tasks, traditional low-rank tensor decomposition models suffer from a laborious model-selection problem due to their high model sensitivity. In particular, for tensor ring (TR) decomposition, the number of possible models grows exponentially with the tensor order, which makes it rather challenging to find the optimal TR decomposition. In this paper, by exploiting the low-rank structure of the TR latent space, we propose a novel tensor completion method that is robust to model selection. In contrast to imposing a low-rank constraint on the data space, we introduce nuclear-norm regularization on the latent TR factors, so that the optimization step using singular value decomposition (SVD) can be performed at a much smaller scale. By leveraging the alternating direction method of multipliers (ADMM) scheme, the latent TR factors with optimal rank and the recovered tensor can be obtained simultaneously. Our algorithm effectively alleviates the burden of TR-rank selection, and the computational cost is therefore greatly reduced. Extensive experiments on synthetic and real-world data demonstrate the superior performance and efficiency of the proposed approach against state-of-the-art algorithms.

09/07/2018 ∙ by Longhao Yuan, et al.
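The computational saving comes from where the SVD runs: the nuclear-norm proximal step (singular value thresholding) is applied to unfoldings of the small TR factors rather than to unfoldings of the full data tensor. A generic NumPy sketch of that proximal operator (this is standard singular value thresholding, not the paper's exact update rule):

```python
import numpy as np

def svt(M, tau):
    """Proximal operator of tau * nuclear norm: soft-threshold the
    singular values of M by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
```

Inside an ADMM scheme of this kind, each TR factor unfolding passes through such a thresholding step while a separate quadratic subproblem fits the observed entries; dual variables couple the two updates.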

  • Higher-dimension Tensor Completion via Low-rank Tensor Ring Decomposition

The problem of incomplete data is common in signal processing and machine learning. Tensor completion aims to recover incomplete tensor data from its partially observed entries. In this paper, based on the recently proposed tensor ring (TR) decomposition, we propose a new tensor completion approach named tensor ring weighted optimization (TR-WOPT). It finds the latent factors of the incomplete tensor by a gradient descent algorithm and then uses those factors to predict the missing entries. We conduct tensor completion experiments on synthetic and real-world data. The simulation results show that TR-WOPT performs well and is robust for high-dimensional tensors. The image completion results show that the proposed algorithm outperforms state-of-the-art algorithms in many situations; in particular, even better performance is obtained when the image is tensorized to a higher dimension.

07/03/2018 ∙ by Longhao Yuan, et al.
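In a tensor ring model, every core G_k has shape (r_k, n_k, r_{k+1}) and the first and last ranks are tied together by a trace, so no rank is pinned to 1 as in the tensor train. A hedged NumPy sketch of reconstructing the full tensor from TR cores (illustrative, not the authors' code):

```python
import numpy as np

def tr_to_full(cores):
    """Contract TR cores G_k of shape (r_k, n_k, r_{k+1}) into the full
    tensor; the ring is closed by tracing out the boundary rank."""
    full = cores[0]                                   # (r_1, n_1, r_2)
    for G in cores[1:]:
        full = np.tensordot(full, G, axes=([-1], [0]))
    # full now has shape (r_1, n_1, ..., n_N, r_1); close the ring
    return np.trace(full, axis1=0, axis2=full.ndim - 1)
```

TR-WOPT fits these cores by gradient descent on the observed entries, analogously to TT-WOPT but with the trace closure, which is what gives the ring format its extra flexibility.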

  • Rank Minimization on Tensor Ring: A New Paradigm in Scalable Tensor Decomposition and Completion

In low-rank tensor completion tasks, traditional methods suffer from high computational cost and high sensitivity to model complexity, due to the multiple underlying large-scale singular value decomposition (SVD) operations and the rank-selection problem. In this paper, taking advantage of the high compressibility of the recently proposed tensor ring (TR) decomposition, we propose a new model for the tensor completion problem. This is achieved by introducing convex surrogates of the tensor low-rank assumption on the latent tensor ring factors, which makes it possible to solve Schatten-norm-regularized models at a much smaller scale. We propose two algorithms that apply different structured Schatten norms to the tensor ring factors. Via the alternating direction method of multipliers (ADMM) scheme, the tensor ring factors and the predicted tensor can be optimized simultaneously. Experiments on synthetic and real-world data show the high performance and efficiency of the proposed approach.

05/22/2018 ∙ by Longhao Yuan, et al.
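The regularizers here act on matrix unfoldings of the small TR factors, so the SVDs involved are of size roughly n_k × r_k r_{k+1} instead of unfoldings of the full data tensor. A minimal NumPy sketch of one such structured norm, the nuclear (Schatten-1) norm of the mode-2 unfolding of a TR core (which unfolding and which Schatten norm to use is a design choice; this shows one common case, not necessarily the paper's):

```python
import numpy as np

def core_unfold_nuclear(G):
    """Nuclear (Schatten-1) norm of the mode-2 unfolding of a TR core
    G of shape (r_k, n_k, r_{k+1}): an (n_k, r_k * r_{k+1}) matrix."""
    M = np.transpose(G, (1, 0, 2)).reshape(G.shape[1], -1)
    return np.linalg.svd(M, compute_uv=False).sum()
```

Within ADMM, the proximal step of this norm (singular value soft-thresholding on the same small unfolding) alternates with a data-fit update, with dual variables linking the two.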

  • Randomized Tensor Ring Decomposition and Its Application to Large-scale Data Reconstruction

Dimensionality reduction is an essential technique for multi-way large-scale data, i.e., tensors. Tensor ring (TR) decomposition has become popular due to its high representation ability and flexibility. However, traditional TR decomposition algorithms suffer from high computational cost when facing large-scale data. In this paper, taking advantage of the recently proposed tensor random projection method, we propose two TR decomposition algorithms. By applying random projection on every mode of the large-scale tensor, the TR decomposition can be processed at a much smaller scale. Simulation experiments show that the proposed algorithms are 4-25 times faster than traditional algorithms without loss of accuracy, and our algorithms show superior performance in deep-learning dataset compression and hyperspectral image reconstruction experiments compared to other randomized algorithms.

01/07/2019 ∙ by Longhao Yuan, et al.
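The idea is to sketch every mode of the large tensor with an independent random matrix and then run the TR decomposition on the much smaller sketch. A hedged NumPy sketch of such a mode-wise projection (the Gaussian choice and 1/sqrt(k) scaling are a common convention, not necessarily the paper's exact scheme):

```python
import numpy as np

def tensor_random_projection(T, sketch_sizes, seed=0):
    """Multiply each mode of T by a (k_i x n_i) Gaussian matrix,
    shrinking mode i from n_i down to k_i."""
    rng = np.random.default_rng(seed)
    S = T
    for mode, k in enumerate(sketch_sizes):
        Omega = rng.standard_normal((k, S.shape[mode])) / np.sqrt(k)
        # contract Omega with the chosen mode, then restore the axis order
        S = np.moveaxis(np.tensordot(Omega, S, axes=([1], [mode])), 0, mode)
    return S
```

Since the sketch has size k_1 × ... × k_N with each k_i much smaller than n_i, the downstream TR factorization cost drops accordingly, which is where the reported 4-25x speedups come from.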
