Nonconvex Regularization for Network Slimming: Compressing CNNs Even More

10/03/2020
by Kevin Bui, et al.

In the last decade, convolutional neural networks (CNNs) have become the dominant models for various computer vision tasks, but they cannot be deployed on low-memory devices due to their high memory requirements and computational costs. One popular, straightforward approach to compressing CNNs is network slimming, which imposes an ℓ_1 penalty on the channel-associated scaling factors in the batch normalization layers during training. Channels with low scaling factors are thereby identified as insignificant and pruned from the model. In this paper, we propose replacing the ℓ_1 penalty with the ℓ_p and transformed ℓ_1 (Tℓ_1) penalties, since these nonconvex penalties have outperformed ℓ_1 in yielding sparser yet satisfactory solutions in various compressed sensing problems. In our numerical experiments, we demonstrate network slimming with the ℓ_p and Tℓ_1 penalties on VGGNet and DenseNet trained on CIFAR-10/100. The results show that the nonconvex penalties compress CNNs better than ℓ_1. In addition, Tℓ_1 preserves model accuracy after channel pruning, while ℓ_{1/2} and ℓ_{3/4} yield compressed models with accuracies similar to ℓ_1's after retraining.
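For reference, the penalties named in the abstract have standard closed forms in the compressed sensing literature; the sketch below states those standard definitions (the paper's exact parameterization, e.g. the Tℓ_1 parameter a > 0, may differ slightly), applied to the vector γ of batch-norm scaling factors:

```latex
% Convex baseline, nonconvex \ell_p (0 < p < 1), and transformed \ell_1 (a > 0),
% applied channel-wise to the batch-norm scaling factors \gamma.
\ell_1(\gamma) = \sum_i |\gamma_i|, \qquad
\ell_p(\gamma) = \sum_i |\gamma_i|^p, \qquad
T\ell_1(\gamma) = \sum_i \frac{(a+1)\,|\gamma_i|}{a + |\gamma_i|}.
```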
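To make the training-time mechanism concrete, here is a minimal PyTorch-style sketch of how such a penalty could be folded into training: the (sub)gradient of the chosen penalty on the batch-norm scaling factors is added to their gradients after the backward pass, mirroring the subgradient update the original network-slimming code uses for ℓ_1. This is an illustration under stated assumptions, not the authors' released implementation; the function name `add_sparsity_grad` and the parameter defaults are hypothetical.

```python
import torch
import torch.nn as nn

def add_sparsity_grad(model, penalty="tl1", lam=1e-4, p=0.5, a=1.0, eps=1e-8):
    """Add the (sub)gradient of a sparsity penalty on the batch-norm
    scaling factors (gamma) to their existing gradients.

    Call after loss.backward() and before optimizer.step().
    """
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            g = m.weight.data            # channel-wise scaling factors gamma
            if penalty == "l1":          # sum_i |g_i|
                grad = torch.sign(g)
            elif penalty == "lp":        # sum_i |g_i|^p, 0 < p < 1
                # eps keeps |g|^(p-1) finite when a factor reaches zero
                grad = p * torch.sign(g) * (g.abs() + eps).pow(p - 1.0)
            elif penalty == "tl1":       # sum_i (a+1)|g_i| / (a + |g_i|)
                grad = a * (a + 1.0) * torch.sign(g) / (a + g.abs()).pow(2)
            else:
                raise ValueError(f"unknown penalty: {penalty}")
            m.weight.grad.data.add_(lam * grad)

# Inside the training loop:
#   loss.backward()
#   add_sparsity_grad(model, penalty="tl1", lam=1e-4)
#   optimizer.step()
```

After training, channels whose scaling factors fall below a global magnitude threshold are pruned, and the slimmed model is fine-tuned to recover accuracy.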



Related research

07/02/2023 · A Proximal Algorithm for Network Slimming
As a popular channel pruning method for convolutional neural networks (C...

05/13/2021 · BWCP: Probabilistic Learning-to-Prune Channels for ConvNets via Batch Whitening
This work presents a probabilistic channel pruning method to accelerate ...

09/25/2019 · Accurate and Compact Convolutional Neural Networks with Trained Binarization
Although convolutional neural networks (CNNs) are now widely used in var...

09/20/2023 · CNNs for JPEGs: A Study in Computational Cost
Convolutional neural networks (CNNs) have achieved astonishing advances ...

06/07/2017 · Network Sketching: Exploiting Binary Structure in Deep CNNs
Convolutional neural networks (CNNs) with deep architectures have substa...

08/21/2020 · Training Sparse Neural Networks using Compressed Sensing
Pruning the weights of neural networks is an effective and widely-used t...

08/29/2019 · Smaller Models, Better Generalization
Reducing network complexity has been a major research focus in recent ye...
