Network Slimming by Slimmable Networks: Towards One-Shot Architecture Search for Channel Numbers

03/27/2019
by Jiahui Yu, et al.

We study how to set channel numbers in a neural network to achieve better accuracy under constrained resources (e.g., FLOPs, latency, memory footprint or model size). A simple and one-shot solution, named AutoSlim, is presented. Instead of training many network samples and searching with reinforcement learning, we train a single slimmable network to approximate the network accuracy of different channel configurations. We then iteratively evaluate the trained slimmable model and greedily slim the layer with minimal accuracy drop. By this single pass, we can obtain the optimized channel configurations under different resource constraints. We present experiments with MobileNet v1, MobileNet v2, ResNet-50 and RL-searched MNasNet on ImageNet classification. We show significant improvements over their default channel configurations. We also achieve better accuracy than recent channel pruning methods and neural architecture search methods. Notably, by setting optimized channel numbers, our AutoSlim-MobileNet-v2 at 305M FLOPs achieves 74.2% top-1 accuracy, 2.4% better than the default MobileNet-v2 (301M FLOPs), and even 0.2% better than RL-searched MNasNet (317M FLOPs). Our AutoSlim-ResNet-50 at 570M FLOPs, without depthwise convolutions, achieves 1.3% better accuracy than MobileNet-v1 (569M FLOPs). Code and models will be available at: https://github.com/JiahuiYu/slimmable_networks
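To make the greedy search concrete, below is a minimal sketch of the slimming loop described in the abstract. This is not the authors' implementation: the helpers `estimate_acc` and `flops_of`, and the per-move slimming ratio `step`, are hypothetical stand-ins. In the paper, accuracy for each candidate width is read off the single trained slimmable network on a held-out set, so no retraining is needed per candidate.

```python
from typing import Callable, Dict, List, Sequence

def greedy_slim(
    estimate_acc: Callable[[List[int]], float],  # accuracy of a channel config,
                                                 # estimated via the slimmable net
    flops_of: Callable[[List[int]], float],      # FLOPs cost of a channel config
    init_cfg: Sequence[int],                     # starting channels per layer
    flops_targets: Sequence[float],              # resource budgets to record
    step: float = 0.1,                           # fraction of channels removed per move
) -> Dict[float, List[int]]:
    """Greedily slim one layer at a time, always picking the layer whose
    reduction costs the least accuracy, and snapshot the configuration
    each time a FLOPs budget is reached."""
    cfg = list(init_cfg)
    remaining = sorted(flops_targets, reverse=True)  # largest budget first
    found: Dict[float, List[int]] = {}
    while remaining:
        best_acc, best_layer = float("-inf"), None
        for i in range(len(cfg)):
            trial = list(cfg)
            trial[i] = max(1, int(trial[i] * (1.0 - step)))
            if trial[i] == cfg[i]:               # layer already at minimum width
                continue
            acc = estimate_acc(trial)            # one evaluation pass, no retraining
            if acc > best_acc:
                best_acc, best_layer = acc, i
        if best_layer is None:                   # nothing left to slim
            break
        cfg[best_layer] = max(1, int(cfg[best_layer] * (1.0 - step)))
        while remaining and flops_of(cfg) <= remaining[0]:
            found[remaining.pop(0)] = list(cfg)  # snapshot at this budget
    return found
```

The expensive step happens once: the slimmable network is trained a single time, so each candidate configuration in the loop above costs only a validation pass, which is what makes the search "one-shot."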

Related research

AtomNAS: Fine-Grained End-to-End Neural Architecture Search (12/20/2019)
Designing of search space is a critical problem for neural architecture ...

Joint Channel and Weight Pruning for Model Acceleration on Mobile Devices (10/15/2021)
For practical deep neural network design on mobile devices, it is essent...

Slimmable Pruned Neural Networks (12/07/2022)
Slimmable Neural Networks (S-Net) is a novel network which enabled to se...

Slimmable Neural Networks (12/21/2018)
We present a simple and general method to train a single neural network ...

DARC: Differentiable ARchitecture Compression (05/20/2019)
In many learning situations, resources at inference time are significant...

Adaptive Neural Networks Using Residual Fitting (01/13/2023)
Current methods for estimating the required neural-network size for a gi...

Network Adjustment: Channel Search Guided by FLOPs Utilization Ratio (04/06/2020)
Automatic designing computationally efficient neural networks has receiv...
