Analyzing Large and Sparse Tensor Data using Spectral Low-Rank Approximation

12/14/2020
by L. Eldén, et al.

Information is extracted from large and sparse data sets organized as 3-mode tensors. Two methods are described, based on best rank-(2,2,2) and rank-(2,2,1) approximations of the tensor. The first method can be considered a generalization of spectral graph partitioning to tensors; it produces a reordering of the tensor that clusters the information. The second method gives an expansion of the tensor in sparse rank-(2,2,1) terms, where each term corresponds to a graph. The low-rank approximations are computed with an efficient Krylov-Schur type algorithm that avoids fill-in of the sparse data. The methods are applied to topic search in news text, a tensor representing conference author-term-year data, and network traffic logs.
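To make the rank-(2,2,2) idea concrete, here is a minimal sketch of a truncated HOSVD of a sparse 3-mode tensor: take the two leading left singular vectors of each sparse mode unfolding. This is a simpler stand-in for the paper's Krylov-Schur algorithm (the function name and interface are illustrative, not from the paper); the key shared property is that the unfoldings stay sparse, so the data is never filled in.

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.linalg import svds

def hosvd_rank222(rows, cols, tubes, vals, shape):
    """Rank-(2,2,2) truncated HOSVD of a sparse 3-mode tensor
    given in coordinate form. Returns orthonormal factor matrices
    U1 (I x 2), U2 (J x 2), U3 (K x 2). Illustrative only: the
    paper uses a Krylov-Schur type method instead of HOSVD."""
    I, J, K = shape
    # Sparse mode unfoldings: mode-n fibers become columns.
    X1 = coo_matrix((vals, (rows, cols + J * tubes)), shape=(I, J * K)).tocsr()
    X2 = coo_matrix((vals, (cols, rows + I * tubes)), shape=(J, I * K)).tocsr()
    X3 = coo_matrix((vals, (tubes, rows + I * cols)), shape=(K, I * J)).tocsr()
    factors = []
    for X in (X1, X2, X3):
        # Sparse SVD: only matrix-vector products, no dense fill-in.
        U, _, _ = svds(X, k=2)
        factors.append(U)
    return factors
```

In the spectral-partitioning view, ordering the tensor indices by the second column of each factor plays a role analogous to the Fiedler vector in spectral graph partitioning, which clusters the nonzeros toward the diagonal blocks.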

