Wide Compression: Tensor Ring Nets

02/25/2018
by Wenqi Wang, et al.

Deep neural networks have demonstrated state-of-the-art performance in a variety of real-world applications. In order to obtain performance gains, these networks have grown larger and deeper, containing millions or even billions of parameters and over a thousand layers. The trade-off is that these large architectures require an enormous amount of memory, storage, and computation, thus limiting their usability. Inspired by the recent tensor ring factorization, we introduce Tensor Ring Networks (TR-Nets), which significantly compress both the fully connected layers and the convolutional layers of deep neural networks. Our results show that our TR-Nets approach is able to compress LeNet-5 by 11× without losing accuracy, and can compress the state-of-the-art Wide ResNet by 243× with only a 2.3% accuracy degradation on CIFAR-10 image classification. Overall, this compression scheme shows promise in scientific computing and deep learning, especially for emerging resource-constrained devices such as smartphones, wearables, and IoT devices.
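To illustrate the compression idea at the heart of TR-Nets, here is a minimal NumPy sketch (not the authors' implementation) of a tensor ring representation: a d-way tensor is stored as d small cores, each of shape (r_k, n_k, r_{k+1}), with the last rank wrapping around to the first, and an entry is recovered as the trace of a product of core slices. The dimensions and rank below are illustrative, chosen only to show the parameter saving.

```python
import numpy as np

def tr_reconstruct(cores):
    """Reconstruct a full tensor from tensor ring (TR) cores.

    Each core G_k has shape (r_k, n_k, r_{k+1}), with the final rank
    wrapping around to the first (the "ring"), so that
    T[i1, ..., id] = trace(G1[:, i1, :] @ G2[:, i2, :] @ ... @ Gd[:, id, :]).
    """
    dims = [core.shape[1] for core in cores]
    full = np.empty(dims)
    for idx in np.ndindex(*dims):
        # Multiply the matrix slices selected by this multi-index.
        mat = cores[0][:, idx[0], :]
        for k in range(1, len(cores)):
            mat = mat @ cores[k][:, idx[k], :]
        full[idx] = np.trace(mat)
    return full

# Hypothetical example: a fully connected layer's 16x16 weight matrix
# reshaped into a 4x4x4x4 tensor and factorized with uniform TR-rank 2.
rank, dims = 2, [4, 4, 4, 4]
cores = [np.random.randn(rank, n, rank) for n in dims]
full = tr_reconstruct(cores)

full_params = int(np.prod(dims))        # 256 entries in the dense tensor
tr_params = sum(c.size for c in cores)  # 4 cores of 2*4*2 = 64 entries
print(full.shape, full_params, tr_params)
```

Even at this toy scale the cores use 64 parameters in place of 256; with realistic layer sizes and small ranks the same trace-of-products structure yields the large compression ratios reported above.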


Related research

11/12/2021  Nonlinear Tensor Ring Network
The state-of-the-art deep neural networks (DNNs) have been widely applie...

05/25/2018  Tensorized Spectrum Preserving Compression for Neural Networks
Modern neural networks can have tens of millions of parameters, and are ...

09/29/2015  Compression of Deep Neural Networks on the Fly
Thanks to their state-of-the-art performance, deep neural networks are i...

04/11/2021  TedNet: A Pytorch Toolkit for Tensor Decomposition Networks
Tensor Decomposition Networks (TDNs) prevail for their inherent compact a...

12/15/2017  BT-Nets: Simplifying Deep Neural Networks via Block Term Decomposition
Recently, deep neural networks (DNNs) have been regarded as the state-of...

12/07/2017  AdaComp: Adaptive Residual Gradient Compression for Data-Parallel Distributed Training
Highly distributed training of Deep Neural Networks (DNNs) on future com...

10/15/2019  Reduced-Order Modeling of Deep Neural Networks
We introduce a new method for speeding up the inference of deep neural n...
