Model-Agnostic Structured Sparsification with Learnable Channel Shuffle

02/19/2020
by   Xin-Yu Zhang, et al.

Recent advances in convolutional neural networks (CNNs) usually come at the expense of considerable computational overhead and memory footprint. Network compression aims to alleviate this issue by training compact models with comparable performance. However, existing compression techniques either entail dedicated expert design or incur a moderate performance drop. To address this, we propose a model-agnostic structured sparsification method for efficient network compression. The proposed method automatically induces structurally sparse representations of the convolutional weights, thereby facilitating the implementation of the compressed model with the highly optimized group convolution. We further address the problem of inter-group communication with a learnable channel shuffle mechanism. The proposed approach is model-agnostic and highly compressible with a negligible performance drop. Extensive experimental results and analysis demonstrate that our approach performs favorably against state-of-the-art network pruning methods. The code will be publicly available after the review process.
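To make the idea concrete, below is a minimal PyTorch sketch (not the authors' implementation) of the two ingredients the abstract names: a grouped convolution, which is what a structurally sparse (block-diagonal) weight matrix reduces to, followed by a channel permutation that restores inter-group communication. The class name and the fixed interleaving permutation are illustrative assumptions; the paper learns the shuffle rather than fixing it.

```python
import torch
import torch.nn as nn

class GroupConvWithChannelShuffle(nn.Module):
    """Illustrative sketch: grouped convolution + channel permutation.
    The fixed interleaving permutation below stands in for the paper's
    learnable channel shuffle."""

    def __init__(self, in_channels, out_channels, groups, kernel_size=3):
        super().__init__()
        # Grouped convolution: each group only sees in_channels // groups inputs,
        # i.e. the weight tensor is structurally (block-diagonally) sparse.
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              padding=kernel_size // 2, groups=groups, bias=False)
        # Hypothetical stand-in for the learnable shuffle: a fixed interleaving
        # permutation (as in ShuffleNet) so that subsequent groups mix channels.
        g, c = groups, out_channels // groups
        self.register_buffer(
            "perm", torch.arange(out_channels).reshape(g, c).t().reshape(-1))

    def forward(self, x):
        x = self.conv(x)
        return x[:, self.perm]  # permute channels to enable inter-group communication

# Usage: swap a dense 3x3 convolution for a grouped one plus a shuffle.
if __name__ == "__main__":
    layer = GroupConvWithChannelShuffle(64, 64, groups=4)
    y = layer(torch.randn(1, 64, 32, 32))
    print(y.shape)  # torch.Size([1, 64, 32, 32])
```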


