A pruning method based on the dissimilarity of angle among channels and filters

10/29/2022
by Jiayi Yao, et al.

Convolutional Neural Networks (CNNs) are used ever more widely in various fields, and their computation and memory demands are increasing significantly. To make them applicable under limited conditions, such as embedded applications, network compression has emerged, and among its techniques researchers pay particular attention to network pruning. In this paper, we encode the convolutional network to obtain the similarity of different encoding nodes, evaluate the connectivity-power among convolutional kernels on the basis of that similarity, and then impose different levels of penalty according to connectivity-power. In addition, we propose Channel Pruning based on the Dissimilarity of Angle (DACP). First, we train a sparse model with a GL penalty, then impose an angle-dissimilarity constraint on the channels and filters of the convolutional network to obtain an even sparser structure. Finally, the effectiveness of our method is demonstrated in the experiments section. On CIFAR-10, we reduce FLOPs by 66.86%, where FLOPs denotes the number of floating-point operations of the model. Moreover, on ResNet-32, we reduce FLOPs by 58.46%, while accuracy after pruning reaches 91.76%.
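The core idea above, measuring how redundant two filters are by the angle between their weight vectors, can be sketched in a few lines. The following is an illustrative sketch only, not the paper's DACP implementation: the function names are hypothetical, and it simply flags filters that are nearly parallel (small angle, high cosine similarity) as pruning candidates.

```python
import numpy as np

def angle_dissimilarity(filters):
    """Pairwise angular dissimilarity between flattened filters.

    filters: array of shape (n_filters, ...), e.g. conv weights.
    Returns an (n, n) matrix of angles in radians; larger angles mean
    more dissimilar (more complementary) filters.
    """
    flat = filters.reshape(filters.shape[0], -1)
    norms = np.linalg.norm(flat, axis=1, keepdims=True)
    unit = flat / np.clip(norms, 1e-12, None)
    cos = np.clip(unit @ unit.T, -1.0, 1.0)  # clip guards arccos domain
    return np.arccos(cos)

def redundancy_scores(filters):
    """Score each filter by its maximum cosine similarity to any other
    filter; a score near 1 marks a near-parallel, redundant filter."""
    ang = angle_dissimilarity(filters)
    np.fill_diagonal(ang, np.pi)   # exclude self-similarity from the max
    return np.cos(ang).max(axis=1)

# Toy layer: 4 filters of shape (3, 3, 3); filter 3 is a scaled copy of
# filter 0, so the pair has zero angle and maximal redundancy scores.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 3, 3, 3))
w[3] = w[0] * 2.0
scores = redundancy_scores(w)
print(scores)
```

Because the angle ignores magnitude, a filter and its scaled copy score as fully redundant even though their norms differ, which is exactly the property that distinguishes an angle-based criterion from norm-based pruning.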

Related research:

06/18/2019: A One-step Pruning-recovery Framework for Acceleration of Convolutional Neural Networks
Acceleration of convolutional neural network has received increasing att...

01/30/2021: Deep Model Compression based on the Training History
Deep Convolutional Neural Networks (DCNNs) have shown promising results ...

11/18/2021: Automatic Neural Network Pruning that Efficiently Preserves the Model Accuracy
Neural networks performance has been significantly improved in the last ...

11/04/2020: Filter Pruning using Hierarchical Group Sparse Regularization for Deep Convolutional Neural Networks
Since the convolutional neural networks are often trained with redundant...

03/07/2023: Filter Pruning based on Information Capacity and Independence
Filter pruning has been widely used in the compression and acceleration ...

01/22/2020: Pruning CNN's with linear filter ensembles
Despite the promising results of convolutional neural networks (CNNs), a...

02/15/2022: Convolutional Network Fabric Pruning With Label Noise
This paper presents an iterative pruning strategy for Convolutional Netw...
