Backward Reduction of CNN Models with Information Flow Analysis

07/16/2018
by Yu-Hsun Lin, et al.

This paper proposes backward reduction, an algorithm that explores compact CNN design from the information-flow perspective. By taking the network's dynamic behavior into account, the algorithm removes substantial numbers of non-zero weighting parameters (redundant neural channels) to reduce model size, which traditional model-compaction techniques cannot achieve. With the aid of the proposed algorithm, we achieve significant model reduction for ResNet-34 at ImageNet scale (32.3% reduction), surpassing the state-of-the-art result (10.8%). Even for highly optimized models such as SqueezeNet and MobileNet, we still achieve an additional 10.81% reduction with negligible performance degradation.
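To make the idea of removing redundant neural channels concrete, here is a minimal sketch of channel pruning on a convolutional weight tensor. Note the assumptions: the `prune_channels` helper and its L1-norm importance score are illustrative stand-ins, not the paper's algorithm, which instead ranks channels by analyzing information flow through the network.

```python
import numpy as np

def prune_channels(weights, keep_ratio=0.75):
    """Remove the lowest-scoring output channels of a conv layer.

    `weights` has shape (out_channels, in_channels, kH, kW).
    Channel importance here is the L1 norm of each filter -- a simple
    stand-in score; backward reduction as described in the paper uses
    the network's dynamic (information-flow) behavior instead.
    """
    scores = np.abs(weights).sum(axis=(1, 2, 3))    # one score per output channel
    n_keep = max(1, int(round(keep_ratio * len(scores))))
    keep = np.sort(np.argsort(scores)[-n_keep:])    # keep the highest-scoring channels
    return weights[keep], keep

# Example: prune 25% of the output channels of a random 8-channel layer.
w = np.random.randn(8, 3, 3, 3)
pruned, kept = prune_channels(w, keep_ratio=0.75)
print(pruned.shape)  # (6, 3, 3, 3)
```

In a real network, pruning a layer's output channels also requires slicing the corresponding input channels of the next layer, which is why such methods report whole-model size reductions rather than per-layer ones.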


