Quantisation and Pruning for Neural Network Compression and Regularisation

01/14/2020
by Kimessha Paupamah, et al.

Deep neural networks are typically too computationally expensive to run in real-time on consumer-grade hardware and low-powered devices. In this paper, we investigate reducing the computational and memory requirements of neural networks through network pruning and quantisation. We examine their efficacy on large networks like AlexNet compared to recent compact architectures: ShuffleNet and MobileNet. Our results show that pruning and quantisation compress these networks to less than half their original size and improve their efficiency, most notably a 7x speedup on MobileNet. We also demonstrate that pruning, in addition to reducing the number of parameters in a network, can help correct overfitting.
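The abstract does not include code, but both techniques are standard. A minimal sketch using PyTorch's built-in pruning and quantisation utilities might look as follows; MobileNetV2, the 50% sparsity level, and dynamic int8 quantisation of the linear layers are illustrative assumptions, not the paper's exact configuration:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision import models

# Build a MobileNetV2 (an illustrative stand-in for the paper's MobileNet).
model = models.mobilenet_v2()

# Magnitude-based unstructured pruning: zero the smallest 50% of weights
# (by absolute value) in every convolutional and fully connected layer.
# The 0.5 sparsity level is an assumption for illustration.
for module in model.modules():
    if isinstance(module, (nn.Conv2d, nn.Linear)):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # bake the zeroed weights in

# Post-training dynamic quantisation: store and compute the fully
# connected layers in 8-bit integers instead of 32-bit floats.
quantised = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)
```

In a full pipeline the pruned, quantised model would then be fine-tuned and benchmarked against the original for size, latency, and accuracy, which is how compression and speedup figures like those above are measured.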

Related research:

01/13/2020 · Modeling of Pruning Techniques for Deep Neural Networks Simplification
Convolutional Neural Networks (CNNs) suffer from different issues, such ...

05/19/2018 · Sparse Architectures for Text-Independent Speaker Verification Using Deep Neural Networks
Network pruning is of great importance due to the elimination of the uni...

08/12/2023 · Can Unstructured Pruning Reduce the Depth in Deep Neural Networks?
Pruning is a widely used technique for reducing the size of deep neural ...

06/09/2019 · The Generalization-Stability Tradeoff in Neural Network Pruning
Pruning neural network parameters to reduce model size is an area of muc...

08/09/2023 · FPGA Resource-aware Structured Pruning for Real-Time Neural Networks
Neural networks achieve state-of-the-art performance in image classifica...

08/19/2021 · Pruning in the Face of Adversaries
The vulnerability of deep neural networks against adversarial examples -...

10/24/2019 · A Comparative Study of Neural Network Compression
There has recently been an increasing desire to evaluate neural networks...
