Progressive Deep Neural Networks Acceleration via Soft Filter Pruning

08/22/2018
by Yang He, et al.

This paper proposes a Progressive Soft Filter Pruning (PSFP) method to prune the filters of deep neural networks, which can thus be accelerated at inference time. Specifically, the proposed PSFP method prunes the network progressively and allows the pruned filters to be updated when training the model after pruning. PSFP has three advantages over previous works: 1) Larger model capacity. Updating previously pruned filters gives our approach a larger optimization space than fixing the filters to zero, so the network trained by our method has a larger capacity to learn from the training data. 2) Less dependence on the pre-trained model. The larger capacity enables our method to train from scratch and prune the model simultaneously, whereas previous filter pruning methods must be conducted on the basis of a pre-trained model to guarantee their performance. Empirically, PSFP trained from scratch outperforms the previous filter pruning methods. 3) More stable filter selection. Pruning the network progressively makes the selection of low-norm filters much more stable, which has the potential to yield better performance. Moreover, our approach is demonstrated to be effective for many advanced CNN architectures. Notably, on ILSVRC-2012, our method reduces more than 42% of FLOPs with even a 0.2% accuracy improvement. On ResNet-50, our progressive pruning method achieves a 1.08% improvement over the pruning method without progressive pruning.
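
The progressive soft pruning procedure can be sketched roughly as follows. This is a minimal PyTorch-style illustration assuming a linear schedule for the pruning rate and L2-norm filter selection; the helper names (current_prune_rate, soft_prune_filters, train_one_epoch) are hypothetical and this is not the authors' released implementation.

    # Minimal sketch of progressive soft filter pruning (illustrative, not the official code).
    import torch
    import torch.nn as nn

    def current_prune_rate(epoch: int, num_epochs: int, goal_rate: float) -> float:
        """Progressively grow the pruning rate from 0 toward goal_rate
        (a simple linear schedule is assumed here)."""
        return goal_rate * min(1.0, (epoch + 1) / num_epochs)

    @torch.no_grad()
    def soft_prune_filters(model: nn.Module, rate: float) -> None:
        """Soft pruning: zero out the lowest-L2-norm filters of every conv layer.

        The zeroed filters stay in the network and keep receiving gradient updates,
        so they can be "revived" during later training, unlike hard pruning,
        which removes them permanently.
        """
        for module in model.modules():
            if isinstance(module, nn.Conv2d):
                weight = module.weight.data              # shape: (out_ch, in_ch, k, k)
                num_filters = weight.size(0)
                num_pruned = int(num_filters * rate)
                if num_pruned == 0:
                    continue
                # L2 norm of each output filter, smallest norms get pruned
                norms = weight.view(num_filters, -1).norm(p=2, dim=1)
                _, prune_idx = torch.topk(norms, num_pruned, largest=False)
                weight[prune_idx] = 0.0                  # zeroed, but still trainable

    # Assumed training loop: train normally, then soft-prune after each epoch
    # with a progressively larger rate (train_one_epoch is assumed to exist).
    # for epoch in range(num_epochs):
    #     train_one_epoch(model, loader, optimizer)
    #     soft_prune_filters(model, current_prune_rate(epoch, num_epochs, goal_rate=0.3))

The key design choice illustrated here is that pruning is applied as a soft mask between epochs rather than as a permanent removal, and the pruning rate ramps up over training instead of being applied all at once.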

Related research:
08/21/2018

Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks

This paper proposed a Soft Filter Pruning (SFP) method to accelerate the...
05/28/2019

Online Filter Clustering and Pruning for Efficient Convnets

Pruning filters is an effective method for accelerating deep neural netw...
11/06/2020

GHFP: Gradually Hard Filter Pruning

Filter pruning is widely used to reduce the computation of deep learning...
01/26/2019

PruneTrain: Gradual Structured Pruning from Scratch for Faster Neural Network Training

Model pruning is a popular mechanism to make a network more efficient fo...
06/20/2019

An Improved Trade-off Between Accuracy and Complexity with Progressive Gradient Pruning

Although deep neural networks (NNs) have achieved state-of-the-art accur...
01/24/2020

Progressive Local Filter Pruning for Image Retrieval Acceleration

This paper focuses on network pruning for image retrieval acceleration. ...
09/28/2020

Kernel Based Progressive Distillation for Adder Neural Networks

Adder Neural Networks (ANNs) which only contain additions bring us a new...