ChannelNets: Compact and Efficient Convolutional Neural Networks via Channel-Wise Convolutions

09/05/2018
by   Hongyang Gao, et al.

Convolutional neural networks (CNNs) have shown great capability of solving various artificial intelligence tasks. However, the increasing model size has raised challenges in employing them in resource-limited applications. In this work, we propose to compress deep models by using channel-wise convolutions, which replace dense connections among feature maps with sparse ones in CNNs. Based on this novel operation, we build light-weight CNNs known as ChannelNets. ChannelNets use three instances of channel-wise convolutions; namely group channel-wise convolutions, depth-wise separable channel-wise convolutions, and the convolutional classification layer. Compared to prior CNNs designed for mobile devices, ChannelNets achieve a significant reduction in terms of the number of parameters and computational cost without loss in accuracy. Notably, our work represents the first attempt to compress the fully-connected classification layer, which usually accounts for about 25% of the parameters in compact CNNs. Experimental results on the ImageNet dataset demonstrate that ChannelNets achieve consistently better performance compared to prior methods.
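The core idea of a channel-wise convolution can be sketched in plain Python: instead of densely connecting every input channel to every output channel (as a 1x1 convolution does), a small 1-D kernel slides along the channel axis, so each output channel depends only on a local window of input channels. This is a minimal illustration under our own assumptions (function name and 1-D formulation are ours), not the authors' implementation:

```python
def channel_wise_conv(x, kernel, stride=1):
    """Slide a 1-D kernel along the channel axis at one spatial position.

    x      : channel values at a single spatial location (length C_in)
    kernel : 1-D weights of length k, shared across all output channels
    Returns (C_in - k) // stride + 1 output channel values.
    """
    k = len(kernel)
    out = []
    # Each output channel sees only a window of k consecutive input channels,
    # replacing the dense C_in-to-C_out connectivity of a 1x1 convolution.
    for start in range(0, len(x) - k + 1, stride):
        out.append(sum(w * v for w, v in zip(kernel, x[start:start + k])))
    return out

# Four input channels, kernel of size 2: three sparsely connected outputs.
print(channel_wise_conv([1, 2, 3, 4], [1, 1]))  # [3, 5, 7]
```

The parameter saving follows directly: a dense 1x1 convolution between C_in and C_out channels needs C_in x C_out weights, whereas the channel-wise operation above needs only the k kernel weights, shared across all output positions along the channel axis.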

Related research:

- Comb Convolution for Efficient Convolutional Architecture (11/01/2019): Convolutional neural networks (CNNs) are inherently suffering from massi...
- Depth-wise Decomposition for Accelerating Separable Convolutions in Efficient Convolutional Neural Networks (10/21/2019): Very deep convolutional neural networks (CNNs) have been firmly establis...
- MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications (04/17/2017): We present a class of efficient models called MobileNets for mobile and ...
- SEP-Nets: Small and Effective Pattern Networks (06/13/2017): While going deeper has been witnessed to improve the performance of conv...
- CondenseNet: An Efficient DenseNet using Learned Group Convolutions (11/25/2017): Deep neural networks are increasingly used on mobile devices, where comp...
- FalconNet: Factorization for the Light-weight ConvNets (06/10/2023): Designing light-weight CNN models with little parameters and Flops is a ...
- Channel Compression: Rethinking Information Redundancy among Channels in CNN Architecture (07/02/2020): Model compression and acceleration are attracting increasing attentions ...
