Universally Slimmable Networks and Improved Training Techniques

03/12/2019
by Jiahui Yu, et al.

Slimmable networks are a family of neural networks that can instantly adjust their runtime width. The width can be chosen from a predefined set of widths to adaptively optimize accuracy-efficiency trade-offs at runtime. In this work, we propose a systematic approach to train universally slimmable networks (US-Nets), extending slimmable networks to execute at arbitrary width and generalizing to networks both with and without batch normalization layers. We further propose two improved training techniques for US-Nets, named the sandwich rule and inplace distillation, to enhance the training process and boost testing accuracy. We show improved performance of universally slimmable MobileNet v1 and MobileNet v2 on the ImageNet classification task, compared with individually trained networks and 4-switch slimmable network baselines. We also evaluate the proposed US-Nets and improved training techniques on image super-resolution and deep reinforcement learning tasks. Extensive ablation experiments on these representative tasks demonstrate the effectiveness of our proposed methods. Our discovery opens up the possibility of directly evaluating the FLOPs-accuracy spectrum of network architectures. Code and models will be available at: https://github.com/JiahuiYu/slimmable_networks
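The two training techniques named in the abstract can be illustrated with a toy sketch. The sandwich rule trains the smallest and largest widths in every iteration plus a few randomly sampled intermediate widths; inplace distillation uses the full-width network's output (with gradients stopped) as the soft target for every narrower sub-network in the same iteration. The minimal model below (a single slimmable layer where width w means "use the first w hidden units") is entirely illustrative — the layer shapes, width range, and loss are assumptions, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy slimmable layer: width w uses only the first w hidden units.
W1 = rng.normal(size=(4, 16)) * 0.1   # input dim 4, maximum width 16
W2 = rng.normal(size=(16, 1)) * 0.1

def forward(x, width):
    """Run the sub-network that keeps only the first `width` hidden units."""
    h = np.maximum(x @ W1[:, :width], 0)   # ReLU over a slice of the weights
    return h @ W2[:width, :]

def sandwich_widths(n_sampled=2, w_min=4, w_max=16):
    """Sandwich rule: always include the smallest and largest widths,
    plus n_sampled randomly drawn intermediate widths."""
    mids = rng.integers(w_min + 1, w_max, size=n_sampled).tolist()
    return [w_min] + mids + [w_max]

x = rng.normal(size=(8, 4))
widths = sandwich_widths()

# Inplace distillation: the full-width output serves as the (gradient-stopped)
# soft target for every sub-network trained in the same iteration.
teacher = forward(x, max(widths))
losses = {w: float(np.mean((forward(x, w) - teacher) ** 2)) for w in widths}
```

In a real training step these per-width losses would be summed and a single backward pass taken; the full-width network itself is trained against the ground-truth labels rather than its own output.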


