Data Agnostic Filter Gating for Efficient Deep Networks

10/28/2020
by Xiu Su, et al.

To deploy a well-trained CNN model on low-end edge devices, it is usually necessary to compress or prune the model to fit a certain computation budget (e.g., FLOPs). Current filter pruning methods mainly leverage feature maps to generate importance scores for filters and prune those with smaller scores, an approach that ignores the variance across input batches and the resulting differences in sparse structure over filters. In this paper, we propose a data-agnostic filter pruning method that uses an auxiliary network, named the Dagger module, to induce pruning; it takes the pretrained weights as input and learns the importance of each filter. In addition, to prune filters under a given FLOPs constraint, we leverage an explicit FLOPs-aware regularization that directly steers pruning toward the target FLOPs. Extensive experimental results on the CIFAR-10 and ImageNet datasets show that our method outperforms other state-of-the-art filter pruning methods. For example, our pruned ResNet-50 at 50% FLOPs achieves 76.1% Top-1 accuracy on ImageNet, surpassing many other filter pruning methods.
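The abstract describes two ingredients: an auxiliary module that scores filters from pretrained weights alone (no input data), and a FLOPs-aware penalty that pulls the gated network toward a compute budget. Below is a minimal, hypothetical PyTorch sketch of how such a scheme could be wired up; the names `DaggerGate` and `flops_regularizer`, the MLP gate architecture, and the simplified linear FLOPs model are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DaggerGate(nn.Module):
    """Predicts a gate in [0, 1] for each filter of one conv layer,
    using only the pretrained filter weights (data-agnostic)."""
    def __init__(self, filter_numel: int, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(filter_numel, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, 1),
        )

    def forward(self, conv_weight: torch.Tensor) -> torch.Tensor:
        # conv_weight: (out_channels, in_channels, kH, kW)
        flat = conv_weight.detach().flatten(1)  # one row per filter
        return torch.sigmoid(self.mlp(flat)).squeeze(-1)  # (out_channels,)

def flops_regularizer(gates, layer_flops, target_flops):
    """FLOPs-aware penalty: squared relative deviation of the expected
    (gated) FLOPs from the target budget.

    Simplification: each layer's cost is assumed to scale linearly with
    the fraction of kept filters; the real dependence also involves the
    preceding layer's kept channels."""
    expected = sum(g.mean() * f for g, f in zip(gates, layer_flops))
    return (expected / target_flops - 1.0) ** 2
```

In such a setup, the predicted gates would multiply the corresponding feature maps during a short optimization phase, the regularizer would be added to the task loss, and filters whose gates settle near zero would then be pruned before finetuning. Again, this is a sketch of the general gating-plus-budget idea rather than the paper's exact procedure.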
