clcNet: Improving the Efficiency of Convolutional Neural Network using Channel Local Convolutions

12/17/2017
by Dong-Qing Zhang, et al.

Depthwise convolution and grouped convolution have been successfully applied to improve the efficiency of convolutional neural networks (CNNs). This paper generalizes these convolution models to a broader class named channel local convolution (CLC), in which an output channel of a convolution depends on only a subset of its input channels. This concept extends the spatial locality property of convolution to the channel dimension, thereby offering a new dimension for network design. A CLC kernel is characterized by its channel dependency graph (CDG), a graphical representation of the dependencies between the input and output channels. The CDG can be used to guide the design of convolution blocks created by stacking multiple CLC kernels. We present an example of such a design, named the CLC block, a novel structure distinct from previous models that has fewer parameters and lower computational cost. Based on CLC blocks, a new convolutional neural network named clcNet is constructed. Experiments on the ImageNet-1K dataset show that clcNet achieves significantly higher computational efficiency and uses fewer parameters than the state-of-the-art networks.
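The core idea can be illustrated with a small sketch (not the paper's implementation): a 1x1 channel-local convolution in which a binary channel dependency graph (CDG) mask restricts which input channels each output channel may read. The function name `clc_pointwise` and the mask-based formulation are illustrative assumptions; depthwise convolution corresponds to an identity CDG, and grouped convolution to a block-diagonal one.

```python
import numpy as np

def clc_pointwise(x, weight, cdg_mask):
    """Illustrative channel-local 1x1 convolution (sketch, not the paper's code).

    x:        input feature map, shape (C_in, H, W)
    weight:   1x1 kernel weights, shape (C_out, C_in)
    cdg_mask: binary channel dependency graph, shape (C_out, C_in);
              cdg_mask[o, i] = 1 iff output channel o may read input channel i.
    """
    # Zero out disallowed channel dependencies, then apply the 1x1 convolution
    # as a matrix product over the flattened spatial dimensions.
    masked_w = weight * cdg_mask
    c_in, h, w = x.shape
    y = masked_w @ x.reshape(c_in, h * w)
    return y.reshape(-1, h, w)  # shape (C_out, H, W)

# Depthwise convolution is the special case where the CDG is the identity:
x = np.arange(16, dtype=float).reshape(4, 2, 2)
w = np.ones((4, 4))
y_depthwise = clc_pointwise(x, w, np.eye(4))   # each output sees one input channel
y_dense = clc_pointwise(x, w, np.ones((4, 4))) # an all-ones CDG recovers ordinary convolution
```

A grouped convolution with two groups would instead use a block-diagonal mask, zeroing the off-diagonal blocks so no information flows between groups within the layer.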
