Model Compression by Entropy Penalized Reparameterization

by Deniz Oktay et al.
University of Maryland

We describe an end-to-end neural network weight compression approach that draws inspiration from recent latent-variable data compression methods. The network parameters (weights and biases) are represented in a "latent" space, amounting to a reparameterization. This space is equipped with a learned probability model, which is used to impose an entropy penalty on the parameter representation during training, and to compress the representation using arithmetic coding after training. We are thus maximizing accuracy and model compressibility jointly, in an end-to-end fashion, with the rate-error trade-off specified by a hyperparameter. We evaluate our method by compressing six distinct model architectures on the MNIST, CIFAR-10, and ImageNet classification benchmarks. Our method achieves state-of-the-art compression on VGG-16, LeNet300-100, and several ResNet architectures, and is competitive on LeNet-5.
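The core idea above — train on a task loss plus a weighted estimate of the latents' code length under a learned probability model — can be illustrated with a minimal sketch. This is not the authors' implementation: the "decoder" is a fixed scaling, the probability model is a factorized Gaussian surrogate rather than a learned density, and the model is a toy linear regressor; all names here (`decode`, `gaussian_bits`, `objective`) are illustrative.

```python
import numpy as np

# Minimal sketch (assumed, not the paper's code) of an entropy-penalized
# reparameterization. Weights w are produced from latent parameters phi
# via a decoder f, and the objective adds an estimate of the latents'
# code length in bits, weighted by lam, trading accuracy against
# compressed model size.

rng = np.random.default_rng(0)

def decode(phi, scale=0.1):
    # Reparameterization: latent representation -> network weights.
    # Here just a fixed scaling; the paper learns this mapping.
    return scale * phi

def gaussian_bits(phi, sigma=1.0):
    # Surrogate for the learned probability model: total code length
    # -sum(log2 p(phi_i)) under a factorized zero-mean Gaussian.
    log2p = (-0.5 * np.log2(2 * np.pi * sigma**2)
             - phi**2 / (2 * sigma**2 * np.log(2)))
    return -np.sum(log2p)

def objective(phi, X, y, lam):
    w = decode(phi)
    task_loss = np.mean((X @ w - y) ** 2)   # accuracy term (MSE)
    rate = gaussian_bits(phi)               # entropy penalty (bits)
    return task_loss + lam * rate

# Toy regression problem.
X = rng.normal(size=(32, 4))
true_w = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ true_w
phi = rng.normal(size=4)

# lam is the rate-error trade-off hyperparameter: lam = 0 ignores
# compressibility, larger lam favors cheap-to-code latents.
print(objective(phi, X, y, lam=0.0))
print(objective(phi, X, y, lam=0.01))
```

In the full method, both objective terms are differentiable, so the latents and the probability model are trained jointly by gradient descent, and after training the same probability model drives an arithmetic coder over the discretized latents.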


Related research:

- LilNetX: Lightweight Networks with EXtreme Model Compression and Structured Sparsification
- Video Coding Using Learned Latent GAN Compression
- Multiscale Latent-Guided Entropy Model for LiDAR Point Cloud Compression
- Binary Probability Model for Learning Based Image Compression
- Generalized Ternary Connect: End-to-End Learning and Compression of Multiplication-Free Deep Neural Networks
- Architecture Compression
- End-to-end Learning of Compressible Features
