Slimmable Neural Networks

12/21/2018
by Jiahui Yu, et al.

We present a simple and general method to train a single neural network that is executable at different widths (numbers of channels per layer), permitting instant and adaptive accuracy-efficiency trade-offs at runtime. Instead of training individual networks with different width configurations, we train a shared network with switchable batch normalization. At runtime, the network can adjust its width on the fly according to on-device benchmarks and resource constraints, rather than downloading and offloading different models. Our trained networks, named slimmable neural networks, achieve ImageNet classification accuracy similar to (and in many cases better than) that of individually trained MobileNet v1, MobileNet v2, ShuffleNet and ResNet-50 models at the corresponding widths. We also demonstrate better performance of slimmable models compared with individual ones across a wide range of applications, including COCO bounding-box object detection, instance segmentation and person keypoint detection, without tuning hyper-parameters. Lastly, we visualize and discuss the learned features of slimmable networks. Code and models are available at: https://github.com/JiahuiYu/slimmable_networks
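To make the switchable batch normalization idea concrete, below is a minimal PyTorch sketch of slimmable layers and a training step that accumulates gradients across all widths before each update. The class names SwitchableBatchNorm2d and SlimmableConv2d, the helper set_width, and the WIDTH_MULTS list are illustrative assumptions for this sketch; the authors' released code at the repository above is the reference implementation.

```python
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical switchable width multipliers (assumption for this sketch).
WIDTH_MULTS = [0.25, 0.5, 0.75, 1.0]

class SwitchableBatchNorm2d(nn.Module):
    """One private BatchNorm2d per width; the active one is switched at runtime."""
    def __init__(self, num_features_list):
        super().__init__()
        self.bns = nn.ModuleList([nn.BatchNorm2d(c) for c in num_features_list])
        self.width_idx = len(num_features_list) - 1  # default: full width

    def forward(self, x):
        return self.bns[self.width_idx](x)

class SlimmableConv2d(nn.Conv2d):
    """A conv whose active input/output channels are a slice of the full weight."""
    def __init__(self, in_list, out_list, kernel_size, **kwargs):
        super().__init__(max(in_list), max(out_list), kernel_size, **kwargs)
        self.in_list, self.out_list = in_list, out_list
        self.width_idx = len(in_list) - 1

    def forward(self, x):
        cin = self.in_list[self.width_idx]
        cout = self.out_list[self.width_idx]
        weight = self.weight[:cout, :cin]  # share weights by slicing channels
        bias = self.bias[:cout] if self.bias is not None else None
        return F.conv2d(x, weight, bias, self.stride, self.padding,
                        self.dilation, self.groups)

def set_width(model, width_idx):
    """Switch every slimmable module in the model to the given width index."""
    for m in model.modules():
        if isinstance(m, (SwitchableBatchNorm2d, SlimmableConv2d)):
            m.width_idx = width_idx

def train_step(model, x, y, optimizer, criterion):
    """One iteration: accumulate gradients from every width, then update once."""
    optimizer.zero_grad()
    for idx in range(len(WIDTH_MULTS)):
        set_width(model, idx)
        criterion(model(x), y).backward()
    optimizer.step()
```

The key design choice is that each width keeps private batch-normalization statistics (feature means and variances differ across widths), while convolution weights are shared by slicing the leading channels. At inference time, a single call to set_width selects the desired accuracy-efficiency operating point without loading a different model.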

Related research

Universally Slimmable Networks and Improved Training Techniques (03/12/2019)
Slimmable networks are a family of neural networks that can instantly ad...

Any-Precision Deep Neural Networks (11/17/2019)
We present Any-Precision Deep Neural Networks (Any-Precision DNNs), whic...

ParaDiS: Parallelly Distributable Slimmable Neural Networks (10/06/2021)
When several limited power devices are available, one of the most effici...

Stitchable Neural Networks (02/13/2023)
The public model zoo containing enormous powerful pretrained model famil...

Network Slimming by Slimmable Networks: Towards One-Shot Architecture Search for Channel Numbers (03/27/2019)
We study how to set channel numbers in a neural network to achieve bette...

Resolution Switchable Networks for Runtime Efficient Image Recognition (07/19/2020)
We propose a general method to train a single convolutional neural netwo...

Limited Evaluation Evolutionary Optimization of Large Neural Networks (06/26/2018)
Stochastic gradient descent is the most prevalent algorithm to train neu...
