AIP: Adversarial Iterative Pruning Based on Knowledge Transfer for Convolutional Neural Networks

08/31/2021
by   Jingfei Chang, et al.

As their architectures grow deeper and more complex, convolutional neural networks (CNNs) incur considerable computational cost. Meanwhile, existing research reveals significant parameter redundancy in CNNs. Current pruning methods can compress CNNs with little performance drop, but the accuracy loss becomes more severe as the pruning ratio increases. Moreover, some iterative pruning methods struggle to accurately identify and remove unimportant parameters because accuracy degrades during pruning. We propose a novel adversarial iterative pruning method (AIP) for CNNs based on knowledge transfer. The original network is regarded as the teacher and the compressed network as the student. Attention maps and output features are used to transfer information from the teacher to the student. Then, a shallow fully-connected network is designed as a discriminator so that the outputs of the two networks play an adversarial game, which allows the accuracy lost by pruning to be recovered quickly between pruning intervals. Finally, an iterative pruning scheme based on the importance of channels is proposed. We conduct extensive experiments on the image classification datasets CIFAR-10, CIFAR-100, and ILSVRC-2012 to verify that our method compresses CNNs efficiently, in some cases without any accuracy loss. On ILSVRC-2012, when removing 36.78% of the floating-point operations (FLOPs) of ResNet-18, the Top-1 accuracy drop is only 0.66%. Our method is superior to some state-of-the-art pruning schemes in terms of compression rate and accuracy. Moreover, we further demonstrate that AIP generalizes well to object detection on PASCAL VOC.
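The abstract names three ingredients: attention-map and output-feature transfer from teacher to student, a shallow fully-connected discriminator for the adversarial game between the two networks' outputs, and a channel-importance criterion for iterative pruning. The PyTorch sketch below illustrates one plausible realization of these ideas; the helper names, the KL-divergence soft-output loss, the L1 filter-norm importance score, and all hyper-parameters are illustrative assumptions rather than the authors' exact implementation.

```python
# Minimal sketch of AIP-style ingredients (assumed details, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


def attention_map(feature: torch.Tensor) -> torch.Tensor:
    """Spatial attention map: mean of squared activations over channels,
    flattened and L2-normalized per sample."""
    a = feature.pow(2).mean(dim=1).flatten(1)            # (N, H*W)
    return F.normalize(a, dim=1)


def attention_transfer_loss(f_student: torch.Tensor, f_teacher: torch.Tensor) -> torch.Tensor:
    """MSE between student and teacher attention maps at matched layers."""
    return F.mse_loss(attention_map(f_student), attention_map(f_teacher))


class ShallowDiscriminator(nn.Module):
    """Shallow fully-connected net that tries to tell teacher logits from
    student logits; the student is trained to fool it."""
    def __init__(self, num_classes: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_classes, hidden),
            nn.LeakyReLU(0.2),
            nn.Linear(hidden, 1),
        )

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        return self.net(logits)


def channel_importance(conv: nn.Conv2d) -> torch.Tensor:
    """Per-output-channel importance as the L1 norm of the filter weights
    (a common proxy; the paper's exact criterion may differ)."""
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))


def adversarial_kt_losses(student_out, teacher_out, f_s, f_t, disc):
    """One training step's losses: discriminator loss and student loss
    (soft-output matching + attention transfer + adversarial term)."""
    bce = nn.BCEWithLogitsLoss()
    real = torch.ones(teacher_out.size(0), 1, device=teacher_out.device)
    fake = torch.zeros_like(real)

    # Discriminator: teacher logits are "real", student logits are "fake".
    d_loss = bce(disc(teacher_out.detach()), real) + bce(disc(student_out.detach()), fake)

    # Student: match teacher outputs and attention maps, and fool the discriminator.
    kd_loss = F.kl_div(F.log_softmax(student_out, dim=1),
                       F.softmax(teacher_out, dim=1), reduction="batchmean")
    g_loss = kd_loss + attention_transfer_loss(f_s, f_t) + bce(disc(student_out), real)
    return d_loss, g_loss
```

In an iterative pruning loop, `channel_importance` would be used to rank and remove the lowest-scoring channels at each pruning interval, after which the student is briefly fine-tuned with `adversarial_kt_losses` to recover accuracy before the next pruning step.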

