VACL: Variance-Aware Cross-Layer Regularization for Pruning Deep Residual Networks

09/10/2019
by Shuang Gao, et al.

Improving weight sparsity is a common strategy for producing light-weight deep neural networks. However, pruning models that use residual learning is more challenging. In this paper, we introduce Variance-Aware Cross-Layer regularization (VACL), a novel approach to address this problem. VACL consists of two parts: Cross-Layer grouping and Variance-Aware regularization. In Cross-Layer grouping, the i^th filters of layers connected by skip connections are grouped into one regularization group. The Variance-Aware regularization term then takes into account both the first- and second-order statistics of the connected layers to constrain the variance within a group. Our approach can effectively improve the structural sparsity of residual models. For CIFAR10, the proposed method reduces a ResNet model by up to 79.5% and a ResNeXt model by up to 82%; for ImageNet, it yields a pruned ratio of up to 63.3% with a negligible accuracy drop. Our experimental results show that the proposed approach significantly outperforms other state-of-the-art methods in terms of overall model size and accuracy.
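To make the two components concrete, here is a minimal PyTorch sketch, not the authors' implementation: the helper names cross_layer_groups and vacl_penalty, the hyperparameters alpha and beta, and the exact penalty form (a group-lasso-style mean of per-filter norms plus their within-group variance) are assumptions based only on the description above.

```python
import torch

def cross_layer_groups(conv_weights):
    # Cross-Layer grouping: the i^th filters of all conv layers joined by
    # skip connections form one regularization group, so the layers must
    # share the same number of output channels.
    n_filters = conv_weights[0].shape[0]
    return [[w[i] for w in conv_weights] for i in range(n_filters)]

def vacl_penalty(groups, alpha=1e-4, beta=1e-4):
    # Variance-Aware term (assumed form): for each group, take the L2 norm
    # of every member filter, then penalize the mean of those norms
    # (first-order statistic, drives whole groups toward zero) plus their
    # variance (second-order statistic, keeps the tied filters at similar
    # magnitudes so they can be pruned together as one structure).
    penalty = torch.zeros((), device=groups[0][0].device)
    for group in groups:
        norms = torch.stack([f.norm(p=2) for f in group])
        penalty = penalty + alpha * norms.mean() + beta * norms.var(unbiased=False)
    return penalty

if __name__ == "__main__":
    # Two conv weights tied by a skip connection: equal out_ch (8 filters);
    # in_ch may differ, since norms are taken per filter.
    w1 = torch.randn(8, 16, 3, 3, requires_grad=True)
    w2 = torch.randn(8, 8, 3, 3, requires_grad=True)
    reg = vacl_penalty(cross_layer_groups([w1, w2]))
    reg.backward()  # gradients reach both layers' weights
    print(float(reg))
```

In training, such a penalty would simply be added to the task loss, e.g. loss = criterion(model(x), y) + vacl_penalty(groups), so that structurally aligned filters shrink together and can be removed as a unit afterwards.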

Related research

08/09/2019 - Group Pruning using a Bounded-Lp norm for Group Gating and Regularization
Deep neural networks achieve state-of-the-art results on several tasks w...

05/28/2019 - OICSR: Out-In-Channel Sparsity Regularization for Compact Deep Neural Networks
Channel pruning can significantly accelerate and compress deep neural ne...

02/03/2022 - PRUNIX: Non-Ideality Aware Convolutional Neural Network Pruning for Memristive Accelerators
In this work, PRUNIX, a framework for training and pruning convolutional...

01/30/2023 - DepGraph: Towards Any Structural Pruning
Structural pruning enables model acceleration by removing structurally-g...

11/02/2022 - SIMD-size aware weight regularization for fast neural vocoding on CPU
This paper proposes weight regularization for a faster neural vocoder. P...

06/15/2022 - Residual Sparsity Connection Learning for Efficient Video Super-Resolution
Lighter and faster models are crucial for the deployment of video super-...

06/10/2019 - Network Implosion: Effective Model Compression for ResNets via Static Layer Pruning and Retraining
Residual Networks with convolutional layers are widely used in the field...
