Deep Model Compression based on the Training History

01/30/2021
by S. H. Shabbeer Basha, et al.

Deep Convolutional Neural Networks (DCNNs) have shown promising results on several visual recognition problems, motivating researchers to propose popular architectures such as LeNet, AlexNet, VGGNet, ResNet, and many more. These architectures come at the cost of high computational complexity and parameter storage, and deep model compression methods have evolved to reduce both. We propose a novel History Based Filter Pruning (HBFP) method that utilizes the network's training history for filter pruning. Specifically, we prune redundant filters by observing similar patterns in the L1-norms of the filters (the absolute sum of their weights) over the training epochs. We iteratively prune the redundant filters of a CNN in three steps. First, we train the model and select pairs of redundant filters. Next, we optimize the network to increase the similarity between the filters in each pair, which allows us to prune one filter from each pair, based on its importance, without much information loss. Finally, we retrain the network to regain the performance lost due to filter pruning. We evaluate our approach on popular architectures: LeNet-5 on the MNIST dataset, and VGG-16, ResNet-56, and ResNet-110 on the CIFAR-10 dataset. The proposed pruning method outperforms the state-of-the-art in FLOPs (floating-point operations) reduction, achieving up to 97.98% reduction across the LeNet-5, VGG-16, ResNet-56, and ResNet-110 models while maintaining a lower error rate.
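The core idea above, tracking each filter's L1 norm across epochs and pairing filters whose norm trajectories evolve similarly, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the use of Euclidean distance between trajectories, and the rule of pruning the filter with the smaller final L1 norm in each pair are assumptions made here for clarity.

```python
import numpy as np

def l1_norm_history(weight_snapshots):
    """Compute the L1 norm (absolute sum of weights) of every filter
    at each training epoch.

    weight_snapshots: list of arrays, one per epoch, each of shape
    (num_filters, ...) holding a conv layer's weights at that epoch.
    Returns an array of shape (num_filters, num_epochs).
    """
    return np.stack(
        [np.abs(w).reshape(w.shape[0], -1).sum(axis=1) for w in weight_snapshots],
        axis=1,
    )

def select_redundant_pairs(history, num_pairs):
    """Pair up filters whose L1-norm trajectories are most similar
    (smallest Euclidean distance over the epochs); each filter
    appears in at most one pair."""
    n = history.shape[0]
    dists = [
        (np.linalg.norm(history[i] - history[j]), i, j)
        for i in range(n) for j in range(i + 1, n)
    ]
    used, pairs = set(), []
    for _, i, j in sorted(dists):
        if i not in used and j not in used:
            pairs.append((i, j))
            used.update((i, j))
        if len(pairs) == num_pairs:
            break
    return pairs

def filters_to_prune(history, pairs):
    """From each pair, mark the filter with the smaller final L1 norm
    (treated here as the less important one) for pruning."""
    return [i if history[i, -1] < history[j, -1] else j for i, j in pairs]
```

For example, if two filters' L1 norms stay within 0.1 of each other over all epochs while a third oscillates independently, the first two are selected as a redundant pair and the one with the lower final norm is pruned. In the paper's pipeline, this selection step would be followed by the optimization that pushes paired filters closer together and by retraining.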
