Group Pruning using a Bounded-Lp norm for Group Gating and Regularization

08/09/2019
by Chaithanya Kumar Mummadi et al.

Deep neural networks achieve state-of-the-art results on several tasks while increasing in complexity. It has been shown that neural networks can be pruned during training by imposing sparsity-inducing regularizers. In this paper, we investigate two techniques for group-wise pruning during training in order to improve network efficiency. We propose a gating factor after every convolutional layer to induce channel-level sparsity, encouraging insignificant channels to become exactly zero. Further, we introduce and analyse a bounded variant of the L1 regularizer, which interpolates between the L1 and L0 norms, to retain the performance of the network at higher pruning rates. To underline the effectiveness of the proposed methods, we show that the number of parameters of ResNet-164, DenseNet-40 and MobileNetV2 can be reduced by 30% or more on CIFAR100 without a significant drop in accuracy. We achieve state-of-the-art pruning results for ResNet-50 with higher accuracy on ImageNet. Furthermore, we show that the lightweight MobileNetV2 can be compressed further on ImageNet without a significant drop in performance.
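As a rough illustration of the two ideas in the abstract, the sketch below (PyTorch-style, not taken from the paper) adds a learnable per-channel gating factor after a convolution and penalizes the gates with a saturating, bounded variant of L1. The specific form min(|g|/beta, 1) and the names ChannelGate, bounded_l1_penalty and beta are illustrative assumptions; the paper's exact bounded-Lp formulation may differ.

import torch
import torch.nn as nn

class ChannelGate(nn.Module):
    """Learnable per-channel scaling applied after a convolutional layer.

    Gates pushed to (near) zero by the sparsity penalty mark channels that
    can be removed after training. (Illustrative sketch, not the paper's
    implementation.)
    """
    def __init__(self, num_channels):
        super().__init__()
        # one gating factor per output channel, initialized to 1
        self.gate = nn.Parameter(torch.ones(num_channels))

    def forward(self, x):
        # x has shape (batch, channels, height, width)
        return x * self.gate.view(1, -1, 1, 1)

def bounded_l1_penalty(gates, beta=1.0):
    """Bounded L1 penalty: sum of min(|g| / beta, 1) over all gates.

    Near zero it behaves like a rescaled L1 norm; once |g| exceeds beta it
    saturates at 1, approximating an L0 count so that large gates are not
    shrunk further. beta controls where the L1-to-L0 transition happens.
    """
    return torch.clamp(gates.abs() / beta, max=1.0).sum()

In training, the penalty would be summed over all gating layers and added to the task loss with a weight, e.g. loss = task_loss + lam * sum(bounded_l1_penalty(m.gate) for m in model.modules() if isinstance(m, ChannelGate)); channels whose gate ends up at (or near) zero can then be pruned.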


