Almost-lossless compression of a low-rank random tensor

10/08/2022
by Minh Thành Vu et al.

In this work, we establish an asymptotic limit for the almost-lossless compression of a random tensor over a finite alphabet that admits a low-rank canonical polyadic (CP) decomposition.
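As a rough illustration of why such a tensor is highly compressible (a minimal sketch, not the paper's coding scheme), the Python snippet below builds a random order-3 tensor with a rank-R CP decomposition over a finite alphabet and compares the naive entry-wise storage cost against the cost of describing the CP factors. The parameters n, R, and q are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (illustrative, not the paper's scheme): a rank-R CP
# tensor over a finite alphabet is described by O(n) factor entries,
# while entry-wise coding of the full tensor costs O(n^3) entries.
import numpy as np

rng = np.random.default_rng(0)
n, R, q = 20, 3, 4   # side length, CP rank, alphabet {0, ..., q-1} (assumed values)

# Factor matrices with i.i.d. entries from the finite alphabet.
A = rng.integers(0, q, size=(n, R))
B = rng.integers(0, q, size=(n, R))
C = rng.integers(0, q, size=(n, R))

# Rank-R CP tensor: T[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r].
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Naive cost: code each of the n^3 entries of T separately.
naive_bits = T.size * np.log2(T.max() + 1)

# Structured cost: code the three n-by-R factor matrices instead.
factor_bits = 3 * n * R * np.log2(q)

print(f"entry-wise: {naive_bits:.0f} bits, CP factors: {factor_bits:.0f} bits")
```

For fixed rank R and growing side length n, the factor description grows linearly in n while entry-wise coding grows cubically, which is the intuition behind compression limits far below the tensor's ambient size.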


Related research

03/24/2019
One time is not enough: iterative tensor decomposition for neural network compression
The low-rank tensor approximation is very promising for the compression ...

11/18/2018
The core consistency of a compressed tensor
Tensor decomposition on big data has attracted significant attention rec...

05/22/2018
Low-Rank Tensor Decomposition via Multiple Reshaping and Reordering Operations
Tensor decomposition has been widely applied to find low-rank representa...

02/01/2023
Experimental observation on a low-rank tensor model for eigenvalue problems
Here we utilize a low-rank tensor model (LTM) as a function approximator...

12/27/2017
Tensor Regression Networks with various Low-Rank Tensor Approximations
Tensor regression networks achieve a high rate of compression of model par...

02/11/2023
Optimizing Orthogonalized Tensor Deflation via Random Tensor Theory
This paper tackles the problem of recovering a low-rank signal tensor wi...

12/07/2021
Low-rank Tensor Decomposition for Compression of Convolutional Neural Networks Using Funnel Regularization
Tensor decomposition is one of the fundamental techniques for model compr...
