Structured Network Pruning by Measuring Filter-wise Interactions

07/03/2023
by Wenting Tang, et al.

Structured network pruning is a practical approach to reducing computation cost directly while retaining a CNN's generalization performance in real applications. Identifying redundant filters is the core problem in structured network pruning, yet current redundancy criteria focus only on the attributes of individual filters; as pruning sparsity increases, these criteria become neither effective nor efficient enough. Since filter-wise interaction also contributes to the CNN's prediction accuracy, we integrate filter-wise interaction into the redundancy criterion. Our criterion introduces filter importance and filter utilization strength to reflect the decision ability of individual filters and of groups of filters, respectively. Using this new redundancy criterion, we propose a structured network pruning approach, SNPFI (Structured Network Pruning by measuring Filter-wise Interaction). During pruning, SNPFI automatically assigns the proper sparsity based on filter utilization strength and eliminates useless filters by filter importance. After pruning, SNPFI recovers the pruned model's performance effectively without iterative training by minimizing the interaction difference. We empirically demonstrate the effectiveness of SNPFI on several commonly used CNN models, including AlexNet, MobileNetv1, and ResNet-50, across various image classification datasets, including MNIST, CIFAR-10, and ImageNet. For all experimental CNN models, nearly 60% of computation is reduced during network compression while the classification accuracy remains essentially unchanged.
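Since only the abstract is available here, the sketch below illustrates the general shape of structured (filter-level) pruning in PyTorch, not SNPFI itself: the L1-norm score in filter_importance, the fixed per-layer sparsity, and the helper prune_conv are illustrative assumptions standing in for the paper's importance/utilization criterion and its automatic sparsity assignment.

```python
"""Minimal sketch of structured filter pruning (NOT the authors' SNPFI).

A simple L1-norm proxy replaces SNPFI's interaction-aware criterion,
and the sparsity is passed in manually rather than assigned automatically.
"""
import torch
import torch.nn as nn


def filter_importance(conv: nn.Conv2d) -> torch.Tensor:
    # Proxy score: L1 norm of each output filter's weights.
    # SNPFI instead combines individual filter importance with
    # filter-wise interaction; that criterion is not reproduced here.
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))


def prune_conv(conv: nn.Conv2d, sparsity: float) -> nn.Conv2d:
    """Return a new Conv2d keeping the (1 - sparsity) highest-scoring filters."""
    scores = filter_importance(conv)
    n_keep = max(1, int(conv.out_channels * (1.0 - sparsity)))
    keep = torch.topk(scores, n_keep).indices.sort().values
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    with torch.no_grad():
        pruned.weight.copy_(conv.weight[keep])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep])
    return pruned


# Usage: prune 60% of the filters in a single layer.
conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)
smaller = prune_conv(conv, sparsity=0.6)
print(smaller)  # Conv2d(64, 51, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
```

Note that in a full network, removing a layer's output filters also requires shrinking the next layer's input channels to match, and SNPFI additionally recovers accuracy by minimizing the interaction difference rather than by iterative fine-tuning; neither step is shown in this single-layer sketch.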


Related Research

SNF: Filter Pruning via Searching the Proper Number of Filters (12/14/2021)
Convolutional Neural Network (CNN) has an amount of parameter redundancy...

A Main/Subsidiary Network Framework for Simplifying Binary Neural Network (12/11/2018)
To reduce memory footprint and run-time latency, techniques such as neur...

Improve Convolutional Neural Network Pruning by Maximizing Filter Variety (03/11/2022)
Neural network pruning is a widely used strategy for reducing model stor...

Symmetric Convolutional Filters: A Novel Way to Constrain Parameters in CNN (02/26/2022)
We propose a novel technique to constrain parameters in CNN based on sym...

Asymptotic Soft Cluster Pruning for Deep Neural Networks (06/16/2022)
Filter pruning method introduces structural sparsity by removing selecte...

An Entropy-based Pruning Method for CNN Compression (06/19/2017)
This paper aims to simultaneously accelerate and compress off-the-shelf ...

SASL: Saliency-Adaptive Sparsity Learning for Neural Network Acceleration (03/12/2020)
Accelerating the inference speed of CNNs is critical to their deployment...
