SlimConv: Reducing Channel Redundancy in Convolutional Neural Networks by Weights Flipping

03/16/2020
by Jiaxiong Qiu, et al.

Channel redundancy in the feature maps of convolutional neural networks (CNNs) results in heavy memory and computational costs. In this work, we design a novel Slim Convolution (SlimConv) module that boosts the performance of CNNs by reducing channel redundancy. SlimConv consists of three main steps: Reconstruct, Transform, and Fuse, through which the features are split and reorganized more efficiently, so that the learned weights can be compressed effectively. In particular, the core of our module is a weight-flipping operation that greatly improves feature diversity and contributes crucially to performance. SlimConv is a plug-and-play architectural unit that can directly replace convolutional layers in CNNs. We validate the effectiveness of SlimConv through comprehensive experiments on the ImageNet, MS COCO2014, Pascal VOC2012 segmentation, and Pascal VOC2007 detection datasets. The experiments show that SlimConv-equipped models consistently achieve better performance with lower memory and computational cost than their unmodified counterparts. For example, ResNet-101 fitted with SlimConv achieves 77.84% top-1 classification accuracy on ImageNet with 4.87 GFLOPs and 27.96M parameters, almost 0.5% higher accuracy than the baseline while using substantially fewer parameters.
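To make the three steps concrete, below is a minimal PyTorch sketch of a SlimConv-style block as we read the abstract: a squeeze-and-excitation-style gate produces per-channel weights, the core weight-flipping operation reverses those weights along the channel axis, and the two reweighted copies of the input are folded, transformed, and fused. The gate design, the half-channel folding, and the transform widths below are our assumptions for illustration, not the paper's verified architecture.

```python
import torch
import torch.nn as nn


class SlimConvSketch(nn.Module):
    """Illustrative SlimConv-style block (Reconstruct, Transform, Fuse).

    A sketch based on the abstract, not the authors' verified implementation:
    the SE-style gate, the half-channel folding, and the transform widths
    are assumptions made for illustration.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        half = channels // 2
        # Squeeze-and-excitation-style gate producing per-channel weights.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Transform: independent 3x3 convolutions on the two folded paths.
        self.transform_top = nn.Conv2d(half, half, kernel_size=3, padding=1)
        self.transform_bot = nn.Conv2d(half, half // 2, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.gate(x)                   # (N, C, 1, 1) per-channel weights
        w_flip = torch.flip(w, dims=[1])   # core step: flip weights along channels
        # Reconstruct: reweight the input with the original and the flipped
        # weights, then fold each copy to C/2 channels by adding its halves.
        top_a, top_b = (x * w).chunk(2, dim=1)
        bot_a, bot_b = (x * w_flip).chunk(2, dim=1)
        top = top_a + top_b
        bot = bot_a + bot_b
        # Transform each path, then Fuse by concatenation (3C/4 channels out).
        return torch.cat([self.transform_top(top), self.transform_bot(bot)], dim=1)
```

Under these assumptions, a block built with channels=64 maps a (1, 64, 32, 32) tensor to (1, 48, 32, 32), showing how the fold-and-fuse structure yields the channel (and hence parameter) reduction described in the abstract.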


Related research

11/27/2019: GhostNet: More Features from Cheap Operations
Deploying convolutional neural networks (CNNs) on embedded devices is di...

04/10/2019: Drop an Octave: Reducing Spatial Redundancy in Convolutional Neural Networks with Octave Convolution
In natural images, information is conveyed at different frequencies wher...

07/25/2017: Improving Robustness of Feature Representations to Image Deformations using Powered Convolution in CNNs
In this work, we address the problem of improvement of robustness of fea...

12/18/2019: P-CapsNets: a General Form of Convolutional Neural Networks
We propose Pure CapsNets (P-CapsNets) which is a generation of normal CN...

12/09/2021: A New Measure of Model Redundancy for Compressed Convolutional Neural Networks
While recently many designs have been proposed to improve the model effi...

06/22/2020: Split to Be Slim: An Overlooked Redundancy in Vanilla Convolution
Many effective solutions have been proposed to reduce the redundancy of ...

10/22/2021: Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation
This paper introduces a concept of layer aggregation to describe how inf...
