An Improved Trade-off Between Accuracy and Complexity with Progressive Gradient Pruning

06/20/2019
by Le Thanh Nguyen-Meidine, et al.

Although deep neural networks (NNs) have achieved state-of-the-art accuracy in many visual recognition tasks, the growing computational complexity and energy consumption of these networks remain an issue, especially for applications on platforms with limited resources that require real-time processing. Channel pruning techniques have recently shown promising results for the compression of convolutional NNs (CNNs). However, these techniques can result in low accuracy and complex optimisations because some only prune after training the CNN, while others prune from scratch during training by integrating sparsity constraints or modifying the loss function. The progressive soft filter pruning technique provides greater training efficiency, but its soft pruning strategy does not handle the backward pass, which is needed for better optimization. In this paper, a new Progressive Gradient Pruning (PGP) technique is proposed for iterative channel pruning during training. It relies on a criterion that measures the change in channel weights, improving on existing progressive pruning criteria, and on effective hard and soft pruning strategies that adapt the momentum tensors during the backward propagation pass. Experimental results obtained after training various CNNs on the MNIST and CIFAR-10 datasets indicate that the PGP technique can achieve a better trade-off between classification accuracy and network (time and memory) complexity than state-of-the-art channel pruning techniques.
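As a rough illustration only (not the authors' code), the PyTorch-style sketch below shows how a weight-change criterion could drive progressive soft pruning of convolutional channels while also clearing the corresponding SGD momentum buffers, so that pruned filters are not revived by stale momentum during the update. All function names, the exact scoring rule, and the pruning schedule are assumptions for illustration; the actual PGP criterion is defined in the paper.

import torch
import torch.nn as nn

def channel_change_scores(conv: nn.Conv2d, prev_weight: torch.Tensor) -> torch.Tensor:
    # Score each output channel by the L2 norm of its weight change since the
    # last pruning step (an illustrative proxy for gradient-driven importance).
    delta = conv.weight.detach() - prev_weight
    return delta.flatten(1).norm(dim=1)

def soft_prune(conv: nn.Conv2d, optimizer: torch.optim.SGD,
               prune_ratio: float, prev_weight: torch.Tensor) -> None:
    scores = channel_change_scores(conv, prev_weight)
    n_prune = int(prune_ratio * scores.numel())
    if n_prune == 0:
        return
    idx = torch.argsort(scores)[:n_prune]  # least-changing channels
    with torch.no_grad():
        conv.weight[idx] = 0.0             # soft prune: zero the filters
        if conv.bias is not None:
            conv.bias[idx] = 0.0
        # Also zero the SGD momentum buffer for the pruned channels so the
        # backward/update pass does not restore them.
        buf = optimizer.state.get(conv.weight, {}).get('momentum_buffer')
        if buf is not None:
            buf[idx] = 0.0

In a training loop, one would call soft_prune periodically (for example once per epoch) with a gradually increasing prune_ratio, snapshot prev_weight after each call, and hard-prune (physically remove) the zeroed channels once training converges.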

