One-dimensional Tensor Network Recovery

07/21/2022
by Ziang Chen et al.

We study the recovery of the underlying graphs or permutations for tensors in tensor ring or tensor train format. Our proposed algorithms compare matricization ranks after down-sampling, with complexity O(d log d) for d-th order tensors. We prove that our algorithms can almost surely recover the correct graph or permutation when tensor entries can be observed without noise. We further establish the robustness of our algorithms against observational noise. The theoretical results are validated by numerical experiments.
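To illustrate the rank-comparison idea behind such recovery procedures (a minimal sketch of the general principle, not the paper's actual algorithm; the helpers `tt_tensor` and `matricization_rank` are my own names), the snippet below builds a random tensor in tensor-train format and compares unfolding ranks: a matricization along a contiguous split of modes has rank bounded by the TT rank, while a split that separates non-adjacent modes generically has a strictly larger rank.

```python
import numpy as np

def tt_tensor(dims, rank, seed=0):
    """Assemble a random tensor-train (TT) tensor with uniform TT rank
    and return it as a dense array (illustrative helper, not from the paper)."""
    rng = np.random.default_rng(seed)
    d = len(dims)
    cores = [
        rng.standard_normal((1 if k == 0 else rank,
                             dims[k],
                             1 if k == d - 1 else rank))
        for k in range(d)
    ]
    full = cores[0]
    for core in cores[1:]:
        # Contract the trailing bond index with the next core's leading bond index.
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.reshape(dims)

def matricization_rank(T, row_modes):
    """Rank of the unfolding that puts `row_modes` on the rows
    and the remaining modes on the columns."""
    d = T.ndim
    col_modes = [m for m in range(d) if m not in row_modes]
    M = np.transpose(T, list(row_modes) + col_modes)
    M = M.reshape(int(np.prod([T.shape[m] for m in row_modes])), -1)
    return np.linalg.matrix_rank(M)

# The contiguous split {0,1} vs {2,3} has rank at most the TT rank (2 here),
# while the interleaved split {0,2} vs {1,3} generically has larger rank --
# the kind of gap a rank-comparison recovery procedure can exploit.
T = tt_tensor([3, 3, 3, 3], rank=2)
print(matricization_rank(T, [0, 1]), matricization_rank(T, [0, 2]))
```

With Gaussian random cores, the rank gap between the contiguous and interleaved splits holds almost surely, which mirrors the "almost sure recovery" flavor of the paper's guarantees.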


Related research:

- Tensor train construction from tensor actions, with application to compression of large high order derivative tensors (02/14/2020)
- The Fréchet derivative of the tensor t-function (02/19/2023)
- Hierarchical Tensor Ring Completion (04/22/2020)
- Learning Paths from Signature Tensors (09/05/2018)
- Tensor Ring Decomposition: Energy Landscape and One-loop Convergence of Alternating Least Squares (05/17/2019)
- Geometry of tree-based tensor formats in tensor Banach spaces (11/17/2020)
- Approximate Graph Colouring and Crystals (10/15/2022)
