Randomized algorithms for low-rank tensor decompositions in the Tucker format

05/17/2019
by Rachel Minster, et al.

Many applications in data science and scientific computing involve large-scale datasets that are expensive to store and compute with, but can be efficiently compressed and stored in an appropriate tensor format. In recent years, randomized matrix methods have been used to compute low-rank matrix decompositions efficiently and accurately. Motivated by this success, we focus on developing randomized algorithms for tensor decompositions in the Tucker representation. Specifically, we present randomized versions of two well-known compression algorithms, namely, HOSVD and STHOSVD. We present a detailed probabilistic analysis of the error of the randomized tensor algorithms. We also develop variants of these algorithms that tackle specific challenges posed by large-scale datasets. The first variant adaptively finds a low-rank representation satisfying a given tolerance, which is beneficial when the target rank is not known in advance. The second variant preserves the structure of the original tensor, and is beneficial for large sparse tensors that are difficult to load into memory. We consider several datasets for our numerical experiments: synthetic test tensors and realistic applications such as the compression of facial image samples in the Olivetti database and word counts in the Enron email dataset.
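
To make the randomized HOSVD idea concrete, the following is a minimal NumPy sketch: for each mode, a Gaussian sketch of the mode-n unfolding yields an orthonormal factor matrix, and the core tensor is formed by contracting the tensor against the transposed factors. The helper names (unfold, rand_basis, rhosvd) and the oversampling parameter p are illustrative assumptions, not the paper's exact algorithm or interface.

import numpy as np

def unfold(X, mode):
    """Mode-n unfolding: bring `mode` to the front and flatten the rest."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def rand_basis(A, r, p=5, rng=None):
    """Randomized range finder: approximate rank-r orthonormal basis for range(A)."""
    rng = np.random.default_rng(rng)
    Y = A @ rng.standard_normal((A.shape[1], r + p))   # Gaussian sketch with oversampling
    U, _, _ = np.linalg.svd(Y, full_matrices=False)    # orthonormalize the sketch
    return U[:, :r]

def rhosvd(X, ranks, rng=None):
    """Randomized HOSVD sketch: one factor matrix per mode, then the core tensor."""
    factors = [rand_basis(unfold(X, n), r, rng=rng) for n, r in enumerate(ranks)]
    G = X
    for n, U in enumerate(factors):
        # Mode-n product with U^T: G = X x_1 U1^T x_2 U2^T ...
        G = np.moveaxis(np.tensordot(U.T, np.moveaxis(G, n, 0), axes=1), 0, n)
    return G, factors

# Example: a synthetic tensor with exact multilinear rank (5, 5, 5).
rng = np.random.default_rng(0)
core = rng.standard_normal((5, 5, 5))
Us = [np.linalg.qr(rng.standard_normal((40, 5)))[0] for _ in range(3)]
X = np.einsum('abc,ia,jb,kc->ijk', core, *Us)

G, factors = rhosvd(X, (5, 5, 5), rng=rng)
Xhat = np.einsum('abc,ia,jb,kc->ijk', G, *factors)
print(np.linalg.norm(X - Xhat) / np.linalg.norm(X))    # near machine precision

Because the synthetic tensor has exact multilinear rank (5, 5, 5), the relative reconstruction error printed at the end should be near machine precision; on general data the sketch incurs the probabilistic error that the paper analyzes.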

Related research

07/27/2021 · Efficient randomized tensor-based algorithms for function approximation and low-rank kernel interactions
In this paper, we introduce a method for multivariate function approxima...

05/03/2021 · Structured Matrix Approximations via Tensor Decompositions
We provide a computational framework for approximating a class of struct...

08/07/2019 · Faster Tensor Train Decomposition for Sparse Data
In recent years, the application of tensors has become more widespread i...

10/08/2021 · Randomized algorithms for rounding in the Tensor-Train format
The Tensor-Train (TT) format is a highly compact low-rank representation...

03/19/2021 · Mode-wise Tensor Decompositions: Multi-dimensional Generalizations of CUR Decompositions
Low rank tensor approximation is a fundamental tool in modern machine le...

01/29/2021 · Performance of the low-rank tensor-train SVD (TT-SVD) for large dense tensors on modern multi-core CPUs
There are several factorizations of multi-dimensional tensors into lower...

11/27/2022 · Towards Efficient and Accurate Approximation: Tensor Decomposition Based on Randomized Block Krylov Iteration
Efficient and accurate low-rank approximation (LRA) methods are of great...
