Fully Learnable Group Convolution for Acceleration of Deep Neural Networks

03/31/2019
by   Xijun Wang, et al.

Benefiting from its great success on many tasks, deep learning is increasingly deployed on low-computational-cost devices, e.g., smartphones and embedded devices. To reduce the high computational and memory cost, in this work we propose a fully learnable group convolution module (FLGC for short) that is highly efficient and can be embedded into any deep neural network for acceleration. Specifically, our method automatically learns the group structure during training in a fully end-to-end manner, leading to a better structure than existing pre-defined, two-step, or iterative strategies. Moreover, our method can be further combined with depthwise separable convolution, yielding a 5x speedup over the vanilla ResNet50 on a single CPU. An additional advantage is that in FLGC the number of groups can be set to any value, not necessarily 2^k as in most existing methods, allowing a better trade-off between accuracy and speed. As our experiments show, our method outperforms existing learnable group convolution and standard group convolution when using the same number of groups.
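To make the idea concrete, here is a minimal NumPy sketch of the core mechanism a fully learnable group convolution relies on: each input channel and each filter carries learnable logits over the groups, a softmax turns them into soft (differentiable) group assignments during training, and at inference the argmax yields hard assignments that recover an exact group structure. The function name `flgc_connectivity` and this particular parameterization are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(z):
    # row-wise softmax over the group dimension
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def flgc_connectivity(in_logits, out_logits, hard=False):
    """Channel-to-filter connectivity mask for a learnable group conv.

    in_logits:  (C_in, G)  learnable logits assigning input channels to groups
    out_logits: (C_out, G) learnable logits assigning filters to groups
    Returns a (C_out, C_in) mask whose entry [o, i] is (close to) 1 when
    filter o and input channel i land in the same group.
    """
    if hard:
        # inference: one-hot assignments -> an exact block group structure
        a_in = np.eye(in_logits.shape[1])[in_logits.argmax(axis=1)]
        a_out = np.eye(out_logits.shape[1])[out_logits.argmax(axis=1)]
    else:
        # training: soft assignments keep the mask differentiable end-to-end
        a_in, a_out = softmax(in_logits), softmax(out_logits)
    return a_out @ a_in.T

rng = np.random.default_rng(0)
G = 3                                   # any group count works, not just 2**k
in_logits = rng.normal(size=(6, G))     # 6 input channels
out_logits = rng.normal(size=(9, G))    # 9 filters
mask = flgc_connectivity(in_logits, out_logits, hard=True)
print(mask.shape)  # (9, 6)
```

The convolution weight would then be multiplied elementwise by this mask (broadcast over the kernel's spatial dimensions), so cross-group connections are pruned while the network learns, end to end, which channels belong together.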


Related research

- Spatial-Spectral Hyperspectral Classification based on Learnable 3D Group Convolution (07/15/2023)
- Two-level Group Convolution (10/11/2021)
- HGC: Hierarchical Group Convolution for Highly Efficient Neural Network (06/09/2019)
- Differentiable Learning-to-Group Channels via Groupable Convolutional Neural Networks (08/16/2019)
- Learnable Heterogeneous Convolution: Learning both topology and strength (01/13/2023)
- Model-Agnostic Structured Sparsification with Learnable Channel Shuffle (02/19/2020)
