Channel Compression: Rethinking Information Redundancy among Channels in CNN Architecture

07/02/2020
by Jinhua Liang, et al.

Model compression and acceleration are attracting increasing attention due to the demands of embedded devices and mobile applications. Research on efficient convolutional neural networks (CNNs) aims to remove feature redundancy by decomposing or optimizing the convolutional calculation. In this work, feature redundancy is assumed to exist among channels in CNN architectures, which provides some leeway to boost calculation efficiency. Aiming at channel compression, a novel convolutional construction named compact convolution is proposed, embracing progress in spatial convolution, channel grouping, and pooling operations. Specifically, depth-wise separable convolution and a point-wise inter-channel operation are utilized to extract features efficiently. Unlike existing channel compression methods, which usually introduce a considerable number of learnable weights, the proposed compact convolution reduces feature redundancy with no extra parameters. Through the point-wise inter-channel operation, compact convolutions implicitly squeeze the channel dimension of feature maps. To explore rules for reducing channel redundancy in neural networks, a comparison is made among different point-wise inter-channel operations. Moreover, compact convolutions are extended to multiple tasks, such as acoustic scene classification, sound event detection, and image classification. Extensive experiments demonstrate that compact convolution not only exhibits high effectiveness on several multimedia tasks but also can be implemented efficiently by benefiting from parallel computation.
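The abstract does not spell out the exact formulation, but the construction it describes, a depth-wise spatial convolution followed by a parameter-free point-wise inter-channel operation that squeezes the channel dimension, can be sketched as below. The class name `CompactConv2d`, the `squeeze` factor, and the choice of max pooling over adjacent channel groups are illustrative assumptions, not the paper's confirmed design.

```python
# Minimal sketch of the "compact convolution" idea from the abstract:
# depth-wise spatial filtering, then a parameter-free point-wise
# inter-channel pooling that halves (or otherwise squeezes) the channels.
import torch
import torch.nn as nn


class CompactConv2d(nn.Module):
    def __init__(self, in_channels: int, kernel_size: int = 3, squeeze: int = 2):
        super().__init__()
        # Depth-wise separable spatial convolution: one filter per channel,
        # so the channel count is unchanged and no cross-channel mixing occurs.
        self.depthwise = nn.Conv2d(
            in_channels, in_channels, kernel_size,
            padding=kernel_size // 2, groups=in_channels, bias=False,
        )
        # Point-wise inter-channel operation: max pooling over `squeeze`
        # adjacent channels. Pooling adds no learnable parameters, matching
        # the abstract's claim of channel compression with no extra weights.
        self.channel_pool = nn.MaxPool3d(kernel_size=(squeeze, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.depthwise(x)                  # (N, C, H, W)
        x = self.channel_pool(x.unsqueeze(1))  # pool along the channel axis
        return x.squeeze(1)                    # (N, C // squeeze, H, W)


if __name__ == "__main__":
    block = CompactConv2d(in_channels=64, squeeze=2)
    out = block(torch.randn(1, 64, 32, 32))
    print(out.shape)  # torch.Size([1, 32, 32, 32])
```

Max pooling here stands in for one candidate point-wise inter-channel operation; the abstract compares several such operations without naming them, and swapping `nn.MaxPool3d` for `nn.AvgPool3d` would yield another parameter-free variant.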

