Latent Matrices for Tensor Network Decomposition and to Tensor Completion

10/07/2022
by   Peilin Yang, et al.

The prevalent fully-connected tensor network (FCTN) has achieved excellent success in compressing data. However, the FCTN decomposition suffers from slow computation when facing higher-order and large-scale data. Naturally, an interesting question arises: can a new model be proposed that decomposes the tensor into smaller factors and speeds up the computation of the algorithm? This work gives a positive answer by formulating a novel higher-order tensor decomposition model that utilizes latent matrices based on the tensor network structure, which can decompose a tensor into smaller-scale factors than the FCTN decomposition; hence we name it Latent Matrices for Tensor Network Decomposition (LMTN). Furthermore, three optimization algorithms, LMTN-PAM, LMTN-SVD and LMTN-AR, have been developed and applied to the tensor-completion task. In addition, we provide proofs of theoretical convergence and complexity analysis for these algorithms. Experimental results show that our algorithm is effective in both deep-learning dataset compression and higher-order tensor completion, and that our LMTN-SVD algorithm is 3-6 times faster than the FCTN-PAM algorithm with only a 1.8-point drop in accuracy.
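To illustrate the general idea behind such decompositions (not the paper's actual LMTN model), the following toy sketch represents a 3rd-order tensor as a contraction of small cores, tensor-train style, and compares the parameter count with storing the dense tensor. All sizes, ranks, and variable names here are illustrative assumptions.

```python
# Toy sketch of a tensor-network decomposition (NOT the paper's LMTN):
# a 3rd-order tensor is expressed as the contraction of three small cores,
# so far fewer parameters are stored than for the dense tensor.
import numpy as np

rng = np.random.default_rng(0)

I, J, K = 20, 30, 40   # tensor dimensions (hypothetical)
r1, r2 = 4, 4          # internal ranks (hypothetical)

# Cores: G1 is I x r1, G2 is r1 x J x r2, G3 is r2 x K
G1 = rng.standard_normal((I, r1))
G2 = rng.standard_normal((r1, J, r2))
G3 = rng.standard_normal((r2, K))

# Contract the network to recover the full tensor T[i, j, k]
T = np.einsum('ia,ajb,bk->ijk', G1, G2, G3)

dense_params = I * J * K                      # 24000 entries for the dense tensor
tt_params = G1.size + G2.size + G3.size       # 720 entries for the cores
print(T.shape, dense_params, tt_params)
```

In a completion setting, the cores would be optimized so that the contraction matches the observed entries of a partially known tensor; the small number of core parameters is what makes the problem well-posed and the computation fast.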


research
04/04/2022

A high-order tensor completion algorithm based on Fully-Connected Tensor Network weighted optimization

Tensor completion aims at recovering missing data, and it is one of the...
research
07/02/2013

Novel Factorization Strategies for Higher Order Tensors: Implications for Compression and Recovery of Multi-linear Data

In this paper we propose novel methods for compression and recovery of m...
research
02/13/2020

Multiresolution Tensor Learning for Efficient and Interpretable Spatial Analysis

Efficient and interpretable spatial analysis is crucial in many fields s...
research
05/24/2023

SVDinsTN: An Integrated Method for Tensor Network Representation with Efficient Structure Search

Tensor network (TN) representation is a powerful technique for data anal...
research
03/30/2021

Higher-Order Neighborhood Truss Decomposition

The k-truss model is a typical cohesive subgraph model and has received...
research
02/19/2018

Multi-resolution Tensor Learning for Large-Scale Spatial Data

High-dimensional tensor models are notoriously computationally expensive...
research
03/14/2019

Tucker Tensor Layer in Fully Connected Neural Networks

We introduce the Tucker Tensor Layer (TTL), an alternative to the dense ...
