T-Basis: a Compact Representation for Neural Networks
We introduce T-Basis, a novel concept for a compact representation of a set of tensors, each of arbitrary shape, as is commonly found in neural networks. Each tensor in the set is modeled using Tensor Rings, though the concept applies to other Tensor Networks. Owing its name to the T-shape of nodes in the diagram notation of Tensor Rings, a T-Basis is simply a list of equally shaped three-dimensional tensors used to represent Tensor Ring nodes. This representation allows us to parameterize the tensor set with a small number of parameters (the coefficients of the T-Basis tensors), scaling logarithmically with each tensor's size and linearly with the dimensionality of the T-Basis. We evaluate the proposed approach on the task of neural network compression and demonstrate that it reaches high compression rates at acceptable performance drops. Finally, we analyze the memory and operation requirements of the compressed networks and conclude that T-Basis networks are equally well suited for training and inference in resource-constrained environments and for use on edge devices.
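The core idea can be sketched numerically: each Tensor Ring core is a linear combination of a small set of shared, equally shaped three-dimensional basis tensors, so a large tensor is parameterized by per-core coefficient vectors whose count grows logarithmically with the tensor's size. The sketch below is a minimal illustration under assumed shapes and names (`basis`, `make_core`, `tr_reconstruct` are hypothetical, not from the paper's code).

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy sizes: TR rank r, mode size n, and T-Basis dimensionality.
rank, mode, n_basis = 4, 2, 8

# The shared T-Basis: n_basis tensors, each of shape (r, n, r).
basis = rng.standard_normal((n_basis, rank, mode, rank))

def make_core(coeffs):
    """Build one TR core as a coefficient mix of the shared basis tensors."""
    return np.einsum('b,brmk->rmk', coeffs, basis)

def tr_reconstruct(cores):
    """Contract a list of (r, n, r) cores into the full tensor (trace over the ring)."""
    out = cores[0]                       # shape (r, n1, r)
    for core in cores[1:]:
        out = np.einsum('r...k,kml->r...ml', out, core)
    return np.einsum('r...r->...', out)  # close the ring by tracing

d = 10                                   # number of cores -> 2**10 tensor entries
coeffs = rng.standard_normal((d, n_basis))
cores = [make_core(c) for c in coeffs]
full = tr_reconstruct(cores)

print(full.shape)     # (2,)*10: 1024 entries
print(coeffs.size)    # 80 per-tensor coefficients parameterize them
```

Note the scaling the abstract describes: representing the 1024-entry tensor takes `d * n_basis = 80` coefficients, where `d` grows only logarithmically with the tensor's size, and the `n_basis * r * n * r` basis parameters are shared across every tensor in the set.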