On some orthogonalization schemes in Tensor Train format

11/16/2022
by Olivier Coulaud, et al.

In the framework of tensor spaces, we consider orthogonalization kernels that generate an orthogonal basis of a tensor subspace from a set of linearly independent tensors. In particular, we investigate numerically the loss of orthogonality of six orthogonalization methods, namely Classical and Modified Gram-Schmidt with (CGS2, MGS2) and without (CGS, MGS) re-orthogonalization, the Gram approach, and the Householder transformation. To tackle the curse of dimensionality, we represent tensors with low-rank approximations using the Tensor Train (TT) formalism, and we introduce recompression steps into the standard algorithm outlines through the TT-rounding method at a prescribed accuracy. After describing the structure and properties of the algorithms, we illustrate numerically that the theoretical bounds on the loss of orthogonality from the classical round-off analysis of matrix computations still hold, with the unit round-off replaced by the TT-rounding accuracy. The study is completed by a computational analysis of each orthogonalization kernel in terms of memory requirements and computational complexity, measured as the number of TT-rounding operations, which turn out to be the most expensive operations.
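To give an idea of how recompression enters the orthogonalization loop, below is a minimal Python sketch of Modified Gram-Schmidt with re-orthogonalization (MGS2). Dense NumPy arrays stand in for TT tensors, and a placeholder tt_round function (an assumption, not part of any library) marks where TT-rounding at accuracy eps would be applied after each tensor addition or subtraction. This is an illustrative sketch, not the authors' implementation.

import numpy as np

def tt_round(x, eps):
    # Stand-in for TT-rounding at prescribed accuracy eps: in an actual TT
    # implementation this call would recompress the tensor to bounded ranks;
    # with dense arrays it is the identity and only marks where rounding occurs.
    return x

def mgs2(tensors, eps=1e-12):
    # Modified Gram-Schmidt with one re-orthogonalization pass (MGS2), applied
    # to a list of linearly independent tensors (dense stand-ins for TT tensors).
    basis = []
    for w in tensors:
        v = w.copy()
        for _ in range(2):                      # second sweep = re-orthogonalization
            for q in basis:
                h = np.vdot(q, v)               # tensor inner product <q, v>
                v = tt_round(v - h * q, eps)    # subtraction followed by TT-rounding
        basis.append(v / np.linalg.norm(v))     # normalize in the Frobenius norm
    return basis

# Example: orthogonalize four random 3rd-order tensors and check the
# loss of orthogonality max |Q^T Q - I|.
rng = np.random.default_rng(0)
Q = mgs2([rng.standard_normal((4, 5, 6)) for _ in range(4)])
gram = np.array([[np.vdot(a, b) for b in Q] for a in Q])
print(np.max(np.abs(gram - np.eye(4))))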

