Asymptotic Soft Cluster Pruning for Deep Neural Networks

06/16/2022
by Tao Niu, et al.

Filter pruning introduces structural sparsity by removing selected filters and is thus particularly effective for reducing complexity. Previous works empirically prune networks on the premise that a filter with a smaller norm contributes less to the final results. However, such criteria have been shown to be sensitive to the distribution of filters, and the accuracy may be hard to recover since the capacity gap is fixed once pruned. In this paper, we propose a novel filter pruning method called Asymptotic Soft Cluster Pruning (ASCP), which identifies the redundancy of a network based on the similarity of filters. Each filter of the over-parameterized network is first distinguished by clustering and then reconstructed to manually introduce redundancy. Several clustering guidelines are proposed to better preserve feature extraction ability. After reconstruction, filters are allowed to be updated, eliminating the effect of mistakenly selected filters. Besides, various decaying strategies for the pruning rate are adopted to stabilize the pruning process and improve the final performance as well. By gradually generating more identical filters within each cluster, ASCP can remove them through a channel addition operation with almost no accuracy drop. Extensive experiments on the CIFAR-10 and ImageNet datasets show that our method achieves competitive results compared with many state-of-the-art algorithms.
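To make the idea concrete, below is a minimal PyTorch sketch of one soft cluster pruning step: the filters of a convolutional layer are clustered by k-means on their flattened weights, and each filter is then pulled a fraction of the way toward its cluster centroid, so filters within a cluster gradually become identical. The helper name soft_cluster_prune, the use of scikit-learn's KMeans, and the linear blending step are illustrative assumptions, not the authors' implementation.

    import torch
    from sklearn.cluster import KMeans

    def soft_cluster_prune(conv, num_clusters, decay):
        # Hypothetical helper, not the authors' code: pull each filter of a
        # Conv2d layer a fraction `decay` of the way toward its cluster centroid.
        with torch.no_grad():
            w = conv.weight.data                          # (out_ch, in_ch, k, k)
            flat = w.view(w.size(0), -1).cpu().numpy()    # one row per filter
            km = KMeans(n_clusters=num_clusters, n_init=10).fit(flat)
            centroids = torch.from_numpy(km.cluster_centers_).to(w)
            labels = torch.from_numpy(km.labels_).long().to(w.device)
            targets = centroids[labels].view_as(w)        # centroid per filter
            # Soft step: filters remain trainable afterwards, so filters that
            # were clustered by mistake can still recover in later epochs.
            conv.weight.data = (1.0 - decay) * w + decay * targets

Calling this once per epoch with decay increasing toward 1 (for example decay = epoch / total_epochs, one possible schedule among the decaying strategies the abstract mentions) would mimic the asymptotic behavior: within each cluster the filters converge to a single shared filter, and the duplicates can then be removed by summing the corresponding input channels of the next layer.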


Related research

12/14/2021
SNF: Filter Pruning via Searching the Proper Number of Filters
Convolutional Neural Network (CNN) has an amount of parameter redundancy...

05/28/2019
Online Filter Clustering and Pruning for Efficient Convnets
Pruning filters is an effective method for accelerating deep neural netw...

07/03/2023
Structured Network Pruning by Measuring Filter-wise Interactions
Structured network pruning is a practical approach to reduce computation...

07/14/2020
REPrune: Filter Pruning via Representative Election
Even though norm-based filter pruning methods are widely accepted, it is...

07/30/2021
Manipulating Identical Filter Redundancy for Efficient Pruning on Deep and Complicated CNN
The existence of redundancy in Convolutional Neural Networks (CNNs) enab...

11/06/2020
GHFP: Gradually Hard Filter Pruning
Filter pruning is widely used to reduce the computation of deep learning...

09/08/2022
CAP: instance complexity-aware network pruning
Existing differentiable channel pruning methods often attach scaling fac...
