Reducing Computational Complexity of Tensor Contractions via Tensor-Train Networks

09/01/2021
by Ilya Kisil, et al.

There has been a significant expansion in both the volume of data and the range of its applications, along with a concomitant increase in the variety of data sources. These ever-expanding trends have highlighted the necessity for analysis tools that are more versatile than the standard flat-view matrix approach, offering greater opportunities for algorithmic development and computationally faster operations. Tensors, or multi-way arrays, provide such an algebraic framework, one naturally suited to data of large volume, diversity, and veracity. Indeed, the associated tensor decompositions have demonstrated their potential in breaking the Curse of Dimensionality associated with traditional matrix methods, whereby an exponential increase in data volume has adverse or even intractable consequences for computational complexity. A key tool underpinning the multi-linear manipulation of tensors and tensor networks is the standard Tensor Contraction Product (TCP). However, depending on the dimensionality of the underlying tensors, the TCP comes at the price of high computational complexity. In this work, we resort to diagrammatic tensor network manipulation to calculate such products in an efficient and computationally tractable manner, by making use of the Tensor Train decomposition (TTD). This renders the underlying concepts easy to perceive, thereby enhancing intuition for the associated operations, while preserving mathematical rigour. In addition to bypassing cumbersome multi-linear expressions, the proposed Tensor Train Contraction Product model is shown to significantly accelerate the underlying computations, as its cost is independent of the tensor order and linear in the tensor dimension, in contrast to the standard approach, whose cost is exponential in the tensor order.
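To make the idea concrete, below is a minimal NumPy sketch (not the authors' implementation) contrasting the standard TCP, here a full contraction over all modes of two tensors, with the same contraction carried out core-by-core after a TT decomposition, so that no intermediate of size exponential in the tensor order is ever formed. The helper names tt_decompose and tt_inner are illustrative assumptions, not part of the paper.

```python
import numpy as np

def tt_decompose(x, eps=1e-12):
    """TT-SVD sketch: split an order-N tensor into N 3-way cores
    via successive truncated SVDs of the unfolded tensor."""
    dims = x.shape
    cores, r_prev, rest = [], 1, np.asarray(x)
    for n in dims[:-1]:
        rest = rest.reshape(r_prev * n, -1)          # unfold: (r_prev * n_k) x (remaining modes)
        u, s, vt = np.linalg.svd(rest, full_matrices=False)
        r = max(1, int(np.sum(s > eps * s[0])))      # keep numerically significant ranks
        cores.append(u[:, :r].reshape(r_prev, n, r))
        rest = np.diag(s[:r]) @ vt[:r]               # carry the residual factor forward
        r_prev = r
    cores.append(rest.reshape(r_prev, dims[-1], 1))  # last core absorbs the residual
    return cores

def tt_inner(cores_a, cores_b):
    """Full contraction <A, B> over all modes, computed core-by-core:
    a single sweep along the train, with per-core cost linear in the mode
    dimension, and without ever forming the full (exponentially large) arrays."""
    v = np.ones((1, 1))
    for ga, gb in zip(cores_a, cores_b):
        # v[p, q] holds the partial contraction over all modes processed so far
        v = np.einsum('pq,pir,qis->rs', v, ga, gb)
    return v.item()

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4, 4, 4, 4))
b = rng.standard_normal((4, 4, 4, 4, 4))

direct = np.tensordot(a, b, axes=5)                  # standard TCP over all five modes
via_tt = tt_inner(tt_decompose(a), tt_decompose(b))
print(np.allclose(direct, via_tt))                   # True, up to SVD truncation error
```

The sweep in tt_inner touches one pair of cores at a time, which is what keeps the working memory bounded by the TT ranks rather than by the full tensor size.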

Related research:

11/09/2018 - The trouble with tensor ring decompositions
The tensor train decomposition decomposes a tensor into a "train" of 3-w...

08/12/2019 - Efficient Contraction of Large Tensor Networks for Weighted Model Counting through Graph Decompositions
Constrained counting is a fundamental problem in artificial intelligence...

11/30/2021 - HOTTBOX: Higher Order Tensor ToolBOX
HOTTBOX is a Python library for exploratory analysis and visualisation o...

07/14/2017 - Communication Lower Bounds of Bilinear Algorithms for Symmetric Tensor Contractions
Accurate numerical calculations of electronic structure are often domina...

08/31/2021 - A New Approach to Multilinear Dynamical Systems and Control
The current paper presents a new approach to multilinear dynamical syste...

01/04/2022 - TAMM: Tensor Algebra for Many-body Methods
Tensor contraction operations in computational chemistry consume signifi...

10/26/2014 - A Ternary Non-Commutative Latent Factor Model for Scalable Three-Way Real Tensor Completion
Motivated by large-scale Collaborative-Filtering applications, we presen...
