Filter Pruning via Filters Similarity in Consecutive Layers

04/26/2023
by Xiaorui Wang, et al.

Filter pruning is widely adopted to compress and accelerate Convolutional Neural Networks (CNNs), but most previous works ignore the relationship between filters and channels in different layers: processing each layer independently fails to exploit the collaborative relationships across layers. In this paper, we propose a novel pruning method that explicitly leverages the Filters Similarity in Consecutive Layers (FSCL). FSCL compresses models by pruning the filters whose corresponding features contribute least to the model. Extensive experiments demonstrate the effectiveness of FSCL: it yields remarkable improvements over the state of the art in accuracy, FLOPs, and parameter reduction on several benchmark models and datasets.
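
The abstract does not spell out the scoring rule, so the snippet below is only a minimal sketch of the general idea behind a consecutive-layer criterion: a filter in layer l is scored together with the layer-(l+1) weights that would be removed along with it. The function names and the simple joint-norm score are illustrative placeholders assumed here, not the FSCL formulation from the paper.

```python
import torch
import torch.nn as nn


def consecutive_layer_scores(conv_l: nn.Conv2d, conv_next: nn.Conv2d) -> torch.Tensor:
    """Hypothetical per-filter importance for conv_l that also considers conv_next.

    Pruning output filter i of conv_l also removes input channel i of conv_next,
    so a consecutive-layer criterion can score both weight groups jointly.
    This is an illustrative placeholder, not the exact FSCL formula.
    """
    # Each output filter of layer l, flattened: shape (out_l, in_l * k * k)
    w_l = conv_l.weight.detach().flatten(1)
    # Layer l+1 weights that consume channel i, flattened: shape (out_l, out_next * k * k)
    w_next = conv_next.weight.detach().permute(1, 0, 2, 3).flatten(1)

    # Placeholder score: joint L2 norms of a filter and the next-layer weights
    # that would be removed with it.
    return w_l.norm(dim=1) * w_next.norm(dim=1)


def filters_to_prune(conv_l: nn.Conv2d, conv_next: nn.Conv2d, ratio: float = 0.3) -> torch.Tensor:
    """Return indices of the lowest-scoring filters in conv_l (hypothetical helper)."""
    scores = consecutive_layer_scores(conv_l, conv_next)
    k = int(ratio * scores.numel())
    return torch.argsort(scores)[:k]
```

A usage example under the same assumptions: for two adjacent layers `conv1 = nn.Conv2d(64, 128, 3)` and `conv2 = nn.Conv2d(128, 256, 3)`, `filters_to_prune(conv1, conv2, 0.3)` returns the indices of roughly 30% of conv1's filters whose joint score with conv2 is smallest; those output filters of conv1 and the matching input channels of conv2 would then be removed together.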


