Feature Flow Regularization: Improving Structured Sparsity in Deep Neural Networks

06/05/2021
by Yue Wu, et al.

Pruning is a model compression method that removes redundant parameters from deep neural networks (DNNs) while maintaining accuracy. Most existing filter pruning methods require complex treatments such as iterative pruning, feature statistics/ranking, or additional optimization designs in the training process. In this paper, we propose a simple and effective regularization strategy, which we call feature flow regularization (FFR), that improves structured sparsity and filter pruning in DNNs from a new perspective: the evolution of features along the network. Specifically, FFR imposes controls on the gradient and curvature of the feature flow through the neural network, which implicitly increases the sparsity of the parameters. The principle behind FFR is that a coherent and smooth evolution of features leads to an efficient network without redundant parameters. The high structured sparsity obtained from FFR enables us to prune filters effectively. Experiments with VGGNets and ResNets on the CIFAR-10/100 and Tiny ImageNet datasets demonstrate that FFR significantly improves both unstructured and structured sparsity. Our pruning results, in terms of the reduction of parameters and FLOPs, are comparable to or better than those of state-of-the-art pruning methods.
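The abstract describes FFR as a penalty on the gradient and curvature of the feature flow across layers. The following is a minimal PyTorch sketch of one way such a regularizer could look, assuming a list of same-shaped activations collected from consecutive blocks: first differences along the depth dimension stand in for the flow's gradient and second differences for its curvature. The name `ffr_penalty` and the weights `lam_grad`/`lam_curv` are illustrative assumptions, not the authors' implementation.

```python
import torch

def ffr_penalty(features, lam_grad=1e-4, lam_curv=1e-4):
    """Hypothetical feature-flow regularizer (illustrative, not the paper's code).

    `features`: list of same-shaped activations from consecutive blocks.
    First differences along depth approximate the gradient of the feature
    flow; second differences approximate its curvature.
    """
    # First-order differences between consecutive layers (flow "gradient").
    grads = [features[i + 1] - features[i] for i in range(len(features) - 1)]
    grad_term = sum(g.pow(2).mean() for g in grads)
    # Second-order differences between consecutive gradients (flow "curvature").
    curvs = [grads[i + 1] - grads[i] for i in range(len(grads) - 1)]
    curv_term = sum(c.pow(2).mean() for c in curvs)
    return lam_grad * grad_term + lam_curv * curv_term

# Usage during training: add the penalty to the task loss, e.g.
#   loss = criterion(logits, targets) + ffr_penalty(collected_features)
```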

Related research

04/12/2020
A Unified DNN Weight Compression Framework Using Reweighted Optimization Methods
To address the large model size and intensive computation requirement of...

07/25/2022
Trainability Preserving Neural Structured Pruning
Several recent works empirically find finetuning learning rate is critic...

07/15/2021
Only Train Once: A One-Shot Neural Network Training And Pruning Framework
Structured pruning is a commonly used technique in deploying deep neural...

06/01/2020
Pruning via Iterative Ranking of Sensitivity Statistics
With the introduction of SNIP [arXiv:1810.02340v2], it has been demonstr...

10/21/2020
Adaptive Structured Sparse Network for Efficient CNNs with Feature Regularization
Neural networks have made great progress in pixel to pixel image process...

01/26/2022
On The Energy Statistics of Feature Maps in Pruning of Neural Networks with Skip-Connections
We propose a new structured pruning framework for compressing Deep Neura...

12/05/2021
Training Structured Neural Networks Through Manifold Identification and Variance Reduction
This paper proposes an algorithm (RMDA) for training neural networks (NN...
