Batch Normalization Tells You Which Filter is Important

12/02/2021
by Junghun Oh, et al.

The goal of filter pruning is to identify and remove unimportant filters so that convolutional neural networks (CNNs) become more efficient without sacrificing performance. The challenge lies in finding information that indicates how important each filter is to the final output of the network. In this work, we share our observation that the batch normalization (BN) parameters of pre-trained CNNs can be used to estimate the feature distribution of activation outputs without processing any training data. Based on this observation, we propose a simple yet effective filter pruning method that evaluates the importance of each filter from the BN parameters of a pre-trained CNN. Experimental results on CIFAR-10 and ImageNet demonstrate that the proposed method achieves outstanding performance, with and without fine-tuning, in terms of the trade-off between accuracy drop and the reduction in computational complexity and parameter count of the pruned networks.
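The core idea can be sketched as follows. After batch normalization with scale γ and shift β, a channel's pre-activation is approximately Gaussian with mean β and standard deviation |γ|, so the expected post-ReLU activation has a closed form that depends only on the BN parameters. The sketch below illustrates this in plain Python; the exact scoring rule and the helper name `rank_filters` are our illustrative assumptions, not the paper's precise method.

```python
import math

def expected_relu_output(gamma: float, beta: float) -> float:
    """Expected value of max(0, X) for X ~ N(beta, gamma^2).

    After BN, a channel's pre-activation is approximately Gaussian
    with mean beta and std |gamma|, so E[ReLU(X)] is available in
    closed form without touching any training data.
    """
    sigma = abs(gamma)
    if sigma == 0.0:
        return max(0.0, beta)
    z = beta / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf at z
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf at z
    return beta * cdf + sigma * pdf

def rank_filters(bn_gammas, bn_betas):
    """Return filter indices sorted from least to most important,
    scoring each filter by its expected post-ReLU activation."""
    scores = [expected_relu_output(g, b) for g, b in zip(bn_gammas, bn_betas)]
    return sorted(range(len(scores)), key=lambda i: scores[i])
```

A filter whose BN parameters place nearly all of its distribution below zero (small γ, strongly negative β) produces almost no activation after ReLU and is ranked as a pruning candidate first.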


Related research

03/29/2022 · A Passive Similarity based CNN Filter Pruning for Efficient Acoustic Scene Classification
We present a method to develop low-complexity convolutional neural netwo...

08/09/2022 · SBPF: Sensitiveness Based Pruning Framework For Convolutional Neural Network On Image Classification
Pruning techniques are used comprehensively to compress convolutional ne...

08/08/2023 · D-Score: A Synapse-Inspired Approach for Filter Pruning
This paper introduces a new aspect for determining the rank of the unimp...

05/06/2020 · Dependency Aware Filter Pruning
Convolutional neural networks (CNNs) are typically over-parameterized, b...

05/13/2019 · Implicit Filter Sparsification In Convolutional Neural Networks
We show implicit filter level sparsity manifests in convolutional neural...

08/13/2022 · Entropy Induced Pruning Framework for Convolutional Neural Networks
Structured pruning techniques have achieved great compression performanc...

10/28/2022 · LOFT: Finding Lottery Tickets through Filter-wise Training
Recent work on the Lottery Ticket Hypothesis (LTH) shows that there exis...
