DropPruning for Model Compression

12/05/2018
by Haipeng Jia, et al.

Deep neural networks (DNNs) have achieved great success on a variety of challenging tasks. However, most successful DNNs are structurally complex, which leads to high storage requirements and large numbers of floating-point operations. This paper proposes a novel technique, named Drop Pruning, that compresses a DNN by pruning weights from a dense, high-accuracy baseline model without accuracy loss. Drop Pruning follows the standard iterative prune-retrain procedure, with a drop strategy at each pruning step: drop out, stochastically deleting some unimportant weights, and drop in, stochastically recovering some previously pruned weights. Drop out and drop in are designed to address two drawbacks of traditional pruning methods: local importance judgment and an irretrievable pruning process, respectively. A suitable choice of the drop probabilities decreases the model size during pruning and drives it toward the target sparsity. Drop Pruning shares a similar spirit with dropout, with stochastic algorithms in integer optimization, and with the Dense-Sparse-Dense training technique, and it can significantly reduce overfitting while compressing the model. Experimental results demonstrate that Drop Pruning achieves state-of-the-art performance on many benchmark pruning tasks, about 11.1× compression of VGG-16 on CIFAR-10 and 14.3× compression of LeNet-5 on MNIST without accuracy loss, which may provide new insights into model compression.
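To make the drop strategy concrete, here is a minimal NumPy sketch of one drop step. It is an illustration, not the authors' implementation: the function name `drop_step`, the bottom-quartile rule for picking "unimportant" candidates, and the probability values are all assumptions made for this example; the paper's actual importance criterion and probability schedule may differ.

```python
import numpy as np

def drop_step(weights, mask, p_out, p_in, rng):
    """One stochastic drop step over a weight tensor.

    Returns an updated boolean mask (True = weight is live). The
    candidate rule and probabilities here are illustrative assumptions.
    """
    magnitude = np.abs(weights)

    # Drop out: among live weights, treat the smallest-magnitude quarter
    # as "unimportant" candidates, then prune each candidate independently
    # with probability p_out, so low magnitude alone does not force pruning.
    if mask.any():
        threshold = np.quantile(magnitude[mask], 0.25)
        candidates = mask & (magnitude <= threshold)
        mask = mask & ~(candidates & (rng.random(weights.shape) < p_out))

    # Drop in: every pruned weight is recovered with probability p_in,
    # so a mistaken pruning decision is no longer irretrievable.
    mask = mask | (~mask & (rng.random(weights.shape) < p_in))
    return mask

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256))       # stand-in for one layer's weights
m = np.ones_like(w, dtype=bool)
for step in range(20):                # iterative prune-retrain loop
    m = drop_step(w, m, p_out=0.5, p_in=0.05, rng=rng)
    # ... retrain the network with w * m applied here ...
print(f"sparsity after 20 steps: {1 - m.mean():.2%}")
```

Because drop out acts on a fraction of the live weights while drop in acts on all pruned ones, setting the drop-out probability well above the drop-in probability makes the expected sparsity grow until the two effects balance, which is one way to read the abstract's claim that the drop probabilities steer the model toward the target sparsity.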

