Automatic Pruning for Quantized Neural Networks

02/03/2020
by Luis Guerra, et al.

Neural network quantization and pruning are two techniques commonly used to reduce the computational complexity and memory footprint of deep models for deployment. However, most existing pruning strategies operate on full-precision parameters and cannot be directly applied to the discrete parameter distributions that result from quantization. In contrast, we study a combination of these two techniques to achieve further network compression. In particular, we propose an effective pruning strategy for selecting redundant low-precision filters. Furthermore, we leverage Bayesian optimization to efficiently determine the pruning ratio for each layer. We conduct extensive experiments on CIFAR-10 and ImageNet with various architectures and precisions. In particular, for ResNet-18 on ImageNet, we prune 26.12% of the model, achieving top-1 classification accuracies of 47.32% and 59.30% at different quantization precisions.
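To make the idea concrete, here is a minimal sketch of pruning redundant low-precision filters by their L1 norm, applied after a simple uniform quantization step. This is an illustration only: the quantizer, the L1 saliency score, and the per-layer `ratio` argument are assumptions for the sketch, not the paper's exact scheme (the paper selects the ratio per layer via Bayesian optimization rather than fixing it by hand).

```python
import numpy as np

def quantize_weights(w, bits=2):
    # Uniform symmetric quantization (illustrative, not the paper's scheme).
    scale = np.abs(w).max() / (2 ** (bits - 1) - 1)
    return np.round(w / scale) * scale

def prune_quantized_filters(w, ratio):
    """Zero out the `ratio` fraction of output filters with the smallest L1 norm.

    w: conv weight tensor of shape (out_channels, in_channels, kH, kW).
    """
    scores = np.abs(w).reshape(w.shape[0], -1).sum(axis=1)  # L1 norm per filter
    n_prune = int(ratio * w.shape[0])
    pruned = w.copy()
    if n_prune > 0:
        idx = np.argsort(scores)[:n_prune]  # indices of least-salient filters
        pruned[idx] = 0.0
    return pruned

# Toy example: a 16-filter conv layer, quantized to 2 bits, then 25% pruned.
rng = np.random.default_rng(0)
w = rng.normal(size=(16, 8, 3, 3)).astype(np.float32)
wq = quantize_weights(w, bits=2)
wp = prune_quantized_filters(wq, ratio=0.25)
```

A full pipeline would then fine-tune the remaining quantized filters and search the per-layer `ratio` values (e.g. with a Bayesian-optimization library) against an accuracy/compression objective.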


Related research:

- Pruning Ternary Quantization (07/23/2021): We propose pruning ternary quantization (PTQ), a simple, yet effective, ...
- Pruning vs Quantization: Which is Better? (07/06/2023): Neural network pruning and quantization techniques are almost as old as ...
- Joint Pruning Quantization for Extremely Sparse Neural Networks (10/05/2020): We investigate pruning and quantization for deep neural networks. Our go...
- A Programmable Approach to Model Compression (11/06/2019): Deep neural networks frequently contain far more weights, represented at...
- EAST: Encoding-Aware Sparse Training for Deep Memory Compression of ConvNets (12/20/2019): The implementation of Deep Convolutional Neural Networks (ConvNets) on t...
- Dissecting Pruned Neural Networks (06/29/2019): Pruning is a standard technique for removing unnecessary structure from ...
- AP: Selective Activation for De-sparsifying Pruned Neural Networks (12/09/2022): The rectified linear unit (ReLU) is a highly successful activation funct...
