A novel channel pruning method for deep neural network compression

05/29/2018
by   Yiming Hu, et al.

In recent years, deep neural networks have achieved great success in computer vision. However, deploying these deep models on resource-constrained embedded devices such as mobile robots and smartphones remains a major challenge, so network compression is a practical way to reduce memory consumption and computational complexity on such platforms. In this paper, a novel channel pruning method based on a genetic algorithm is proposed to compress very deep Convolutional Neural Networks (CNNs). First, a pre-trained CNN model is pruned layer by layer according to the sensitivity of each layer; the pruned model is then fine-tuned within a knowledge distillation framework. Together, these two steps significantly reduce model redundancy with little accuracy drop. Channel selection is a combinatorial optimization problem with an exponential solution space. To accelerate the selection process, the proposed method formulates it as a search problem that can be solved efficiently by a genetic algorithm. Meanwhile, a two-step approximation fitness function is designed to further improve the efficiency of the genetic process. The proposed method has been verified on three benchmark datasets with two popular CNN models: VGGNet and ResNet. On the CIFAR-100 and ImageNet datasets, our approach outperforms several state-of-the-art methods. On the CIFAR-10 and SVHN datasets, the pruned VGGNet achieves better performance than the original model with 8× parameter compression and 3× FLOPs reduction.
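To illustrate the search formulation described above, the following is a minimal sketch of genetic-algorithm channel selection for a single layer. The fitness function here is a hypothetical stand-in (retained channel importance minus a penalty per kept channel), not the paper's two-step approximation fitness; all names and parameters are illustrative assumptions.

```python
import random

def evaluate_fitness(mask, layer_importance):
    # Hypothetical proxy fitness: reward retained importance,
    # penalize the number of channels kept (compression pressure).
    kept = sum(w for m, w in zip(mask, layer_importance) if m)
    return kept - 0.5 * sum(mask)

def genetic_channel_selection(layer_importance, pop_size=20,
                              generations=50, mutation_rate=0.1, seed=0):
    """Evolve a binary keep/prune mask over one layer's channels."""
    rng = random.Random(seed)
    n = len(layer_importance)
    # Random initial population of binary masks (1 = keep channel).
    population = [[rng.randint(0, 1) for _ in range(n)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Elitist selection: keep the fitter half of the population.
        population.sort(key=lambda m: evaluate_fitness(m, layer_importance),
                        reverse=True)
        survivors = population[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)            # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n):                   # bit-flip mutation
                if rng.random() < mutation_rate:
                    child[i] ^= 1
            children.append(child)
        population = survivors + children
    return max(population, key=lambda m: evaluate_fitness(m, layer_importance))

# Toy per-channel importance scores for an 8-channel layer.
importance = [0.9, 0.1, 0.8, 0.05, 0.7, 0.02, 0.6, 0.01]
mask = genetic_channel_selection(importance)
```

Under this toy fitness, the search should converge toward masks that keep only the high-importance channels; in the actual method, each candidate mask would instead be scored by the two-step approximation fitness, and the pruning depth per layer would be set by the layer's sensitivity.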


