Building Efficient ConvNets using Redundant Feature Pruning

02/21/2018
by Babajide O. Ayinde, et al.

This paper presents an efficient technique to prune deep and/or wide convolutional neural network models by eliminating redundant features (or filters). Previous studies have shown that over-sized deep neural network models tend to produce many redundant features that are either shifted versions of one another or are very similar and show little or no variation, resulting in filtering redundancy. We propose to prune these redundant features, along with their connecting feature maps, according to their differentiation, as measured by their relative cosine distances in the feature space, thus yielding a smaller network with reduced inference cost and competitive performance. We empirically show, on select models and the CIFAR-10 dataset, that inference costs can be reduced by roughly 40% for ResNet-56 and 39% for ResNet-110.
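
The cosine-distance criterion behind this pruning can be sketched compactly. The Python snippet below is a minimal illustration under stated assumptions, not the authors' exact procedure: the helper name redundant_filter_groups and the threshold tau are placeholders chosen here. It flattens each filter of a convolutional layer into a vector, computes pairwise cosine distances, and greedily groups filters that lie within tau of a representative, so that all but one filter per group could be pruned.

import torch
import torch.nn.functional as F

def redundant_filter_groups(conv_weight: torch.Tensor, tau: float = 0.1):
    """Group filters whose pairwise cosine distance falls below tau.

    conv_weight has shape (out_channels, in_channels, kH, kW). Returns a list
    of index groups; filters within a group are near-duplicates, so only one
    representative per group needs to be kept. (Illustrative sketch only.)
    """
    n = conv_weight.shape[0]
    flat = conv_weight.reshape(n, -1)          # one row per filter
    flat = F.normalize(flat, dim=1)            # unit-normalize each row
    cos_dist = 1.0 - flat @ flat.t()           # pairwise cosine distances

    groups, assigned = [], set()
    for i in range(n):
        if i in assigned:
            continue
        # Treat every unassigned filter within tau of filter i as redundant with it.
        members = [j for j in range(n) if j not in assigned and cos_dist[i, j] <= tau]
        assigned.update(members)
        groups.append(members)
    return groups

# Example: a 64-filter 3x3 convolutional layer on 3 input channels.
w = torch.randn(64, 3, 3, 3)
groups = redundant_filter_groups(w, tau=0.1)
keep = [g[0] for g in groups]   # indices of filters to retain

Pruning the non-representative filters of each group, together with the corresponding input channels of the following layer (the "connecting feature maps"), is what yields the smaller network and reduced inference cost described above.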
