Slimmable Pruned Neural Networks

12/07/2022
by Hideaki Kuratsu, et al.

Slimmable Neural Networks (S-Net) are networks that can dynamically select one of several predefined proportions of channels (sub-networks) depending on current computational resource availability. The accuracy of each sub-network in S-Net, however, is inferior to that of an individually trained network of the same size, owing to the difficulty of simultaneously optimizing different sub-networks. In this paper, we propose Slimmable Pruned Neural Networks (SP-Net), whose sub-network structures are learned by pruning, instead of adopting structures with the same proportion of channels in each layer (width multiplier) as in S-Net. We also propose new pruning procedures: multi-base pruning, instead of one-shot or iterative pruning, to achieve both high accuracy and large training time savings. We further introduce slimmable channel sorting (scs) to achieve computation as fast as S-Net, and zero padding match (zpm) pruning to prune residual structures more efficiently. SP-Net can be combined with any channel pruning method and does not require any complicated processing or time-consuming architecture search as NAS models do. Compared with sub-networks of the same FLOPs on S-Net, SP-Net improves accuracy by 1.2-1.5% for ResNet-50, 0.9-4.3% for VGGNet, 1.3-2.7% for MobileNetV1, and 1.4-3.1% for MobileNetV2 on ImageNet. Furthermore, our methods outperform other SOTA pruning methods and are on par with various NAS models in our experimental results on ImageNet. The code is available at https://github.com/hideakikuratsu/SP-Net.
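
To make the switchable-width mechanism concrete, below is a minimal PyTorch sketch of an S-Net-style convolution whose active channel count is selected at run time, together with an L1-norm channel-sorting step standing in for the role that slimmable channel sorting (scs) plays in SP-Net: once filters are ordered by importance, every sub-network is a contiguous slice of the full layer and can run as fast as an S-Net sub-network. The names (`SlimmableConv2d`, `set_width`, `sort_channels_by_l1`) and the L1-norm criterion are illustrative assumptions, not the API or exact method of the released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SlimmableConv2d(nn.Module):
    """Convolution whose active output-channel count is chosen at run time.
    Illustrative sketch only; names are not taken from the SP-Net repo."""

    def __init__(self, in_channels, out_channels, kernel_size, **kwargs):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size, **kwargs)
        self.width_mult = 1.0  # fraction of output channels currently active

    def set_width(self, width_mult):
        self.width_mult = width_mult

    def forward(self, x):
        out_ch = max(1, int(self.conv.out_channels * self.width_mult))
        in_ch = x.shape[1]  # a preceding slimmable layer may have shrunk
        weight = self.conv.weight[:out_ch, :in_ch]
        bias = self.conv.bias[:out_ch] if self.conv.bias is not None else None
        return F.conv2d(x, weight, bias,
                        stride=self.conv.stride, padding=self.conv.padding)

def sort_channels_by_l1(conv):
    """Reorder output channels by descending L1 norm so that a contiguous
    slice keeps the most important filters. L1 norm is one simple
    importance proxy; SP-Net can plug in any channel pruning criterion."""
    with torch.no_grad():
        order = conv.weight.abs().sum(dim=(1, 2, 3)).argsort(descending=True)
        conv.weight.copy_(conv.weight[order])
        if conv.bias is not None:
            conv.bias.copy_(conv.bias[order])

layer = SlimmableConv2d(3, 64, 3, padding=1)
sort_channels_by_l1(layer.conv)   # make the leading channels the important ones
layer.set_width(0.5)              # switch to the half-width sub-network
out = layer(torch.randn(1, 3, 32, 32))
print(out.shape)                  # torch.Size([1, 32, 32, 32])
```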
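
Zero padding match (zpm) pruning targets residual blocks, where the pruned branch must still be added to the identity path. One plausible reading, sketched below under that assumption, is to scatter the branch's surviving channels back into a zero tensor aligned with the identity, so the addition stays valid even when different blocks prune different channels. The helper `zpm_add` and its index handling are hypothetical, not the paper's exact procedure.

```python
import torch

def zpm_add(branch_out, identity, kept_idx):
    """Hypothetical zero-padding match for a pruned residual branch:
    place the branch's surviving channels at their original indices in
    a zero tensor shaped like the identity, then add the shortcut."""
    padded = identity.new_zeros(identity.shape)
    padded[:, kept_idx] = branch_out
    return padded + identity

identity = torch.randn(1, 64, 32, 32)        # shortcut path, full width
kept_idx = torch.tensor([0, 3, 7, 10])       # channels kept after pruning
branch_out = torch.randn(1, 4, 32, 32)       # pruned branch output
y = zpm_add(branch_out, identity, kept_idx)  # shape [1, 64, 32, 32]
```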

