Structured Matrix Approximations via Tensor Decompositions

05/03/2021
by Misha E. Kilmer, et al.

We provide a computational framework for approximating a class of structured matrices; here, the term structure is very general and may refer to a regular sparsity pattern (e.g., block-banded) or be more highly structured (e.g., symmetric block Toeplitz). The goal is to uncover additional latent structure that will in turn lead to computationally efficient algorithms when the new structured matrix approximations are employed in place of the original operator. Our approach has three steps: map the structured matrix to tensors, use tensor compression algorithms, and map the compressed tensors back to obtain two different matrix representations – a sum of Kronecker products and a block low-rank format. The use of tensor decompositions enables us to uncover latent structure in the problem and leads to compressed representations of the original matrix that can be used efficiently in applications. The resulting matrix approximations are memory efficient, easy to compute with, and preserve the error that is due to the tensor compression in the Frobenius norm. Our framework is quite general. We illustrate the ability of our method to uncover block low-rank format on structured matrices from two applications: system identification and space-time covariance matrices. In addition, we demonstrate that our approach can uncover sums of structured Kronecker products on several matrices from the SuiteSparse collection. Finally, we show that our framework is broad enough to encompass and improve on other related results from the literature, as we illustrate with the approximation of a three-dimensional blurring operator.
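A classical instance of the sum-of-Kronecker-products step is the Van Loan–Pitsianis construction: rearrange the matrix so that each block becomes a row, then take a truncated SVD of the rearranged matrix, whose singular triplets yield the Kronecker factors. The sketch below illustrates that idea only; the function name and shapes are illustrative and are not taken from the paper.

```python
import numpy as np

def kronecker_sum_approx(A, m1, n1, m2, n2, r):
    """Approximate A (of size m1*m2 x n1*n2) by sum_{k=1}^r kron(B_k, C_k)
    via an SVD of the Van Loan-Pitsianis rearrangement of A."""
    # Rearrange: each m2 x n2 block of A becomes one row of R.
    R = A.reshape(m1, m2, n1, n2).transpose(0, 2, 1, 3).reshape(m1 * n1, m2 * n2)
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    # Split each singular value evenly between the two factors.
    Bs = [np.sqrt(s[k]) * U[:, k].reshape(m1, n1) for k in range(r)]
    Cs = [np.sqrt(s[k]) * Vt[k, :].reshape(m2, n2) for k in range(r)]
    return Bs, Cs

# Usage: a matrix that is exactly one Kronecker product is recovered at r = 1,
# because its rearrangement has rank one.
B = np.random.rand(3, 4)
C = np.random.rand(5, 6)
A = np.kron(B, C)
Bs, Cs = kronecker_sum_approx(A, 3, 4, 5, 6, 1)
A1 = np.kron(Bs[0], Cs[0])
print(np.allclose(A, A1))  # prints True
```

For a general matrix, the Frobenius-norm error of the rank-r sum equals the tail of the singular values of the rearranged matrix, which matches the abstract's remark that the compression error is preserved in the Frobenius norm.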

research · 05/17/2019
Randomized algorithms for low-rank tensor decompositions in the Tucker format
Many applications in data science and scientific computing involve large...

research · 08/29/2019
Multi-resolution Low-rank Tensor Formats
We describe a simple, black-box compression format for tensors with a mu...

research · 10/28/2020
Matrix and tensor rigidity and L_p-approximation
In this note we make two observations. First: the low-rank approximation...

research · 03/19/2021
Mode-wise Tensor Decompositions: Multi-dimensional Generalizations of CUR Decompositions
Low rank tensor approximation is a fundamental tool in modern machine le...

research · 02/14/2021
Doping: A technique for efficient compression of LSTM models using sparse structured additive matrices
Structured matrices, such as those derived from Kronecker products (KP),...

research · 08/26/2020
Low Tensor Train- and Low Multilinear Rank Approximations for De-speckling and Compression of 3D Optical Coherence Tomography Images
This paper proposes low tensor-train (TT) rank and low multilinear (ML) ...

research · 09/25/2020
On the Compression of Translation Operator Tensors in FMM-FFT-Accelerated SIE Simulators via Tensor Decompositions
Tensor decomposition methodologies are proposed to reduce the memory req...
