Network Adjustment: Channel Search Guided by FLOPs Utilization Ratio

04/06/2020
by Zhengsu Chen, et al.

Automatically designing computationally efficient neural networks has received much attention in recent years. Existing approaches either utilize network pruning or leverage network architecture search methods. This paper presents a new framework named network adjustment, which considers network accuracy as a function of FLOPs, so that under each network configuration, one can estimate the FLOPs utilization ratio (FUR) for each layer and use it to determine whether to increase or decrease the number of channels in that layer. Note that FUR, like the gradient of a non-linear function, is accurate only in a small neighborhood of the current network. Hence, we design an iterative mechanism in which the initial network undergoes a number of steps, each with a small 'adjusting rate' that limits the changes to the network. The computational overhead of the entire search process is reasonable, i.e., comparable to that of re-training the final model from scratch. Experiments on standard image classification datasets and a wide range of base networks demonstrate the effectiveness of our approach, which consistently outperforms the pruning counterpart. The code is available at https://github.com/danczs/NetworkAdjustment.
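To make the idea concrete, below is a minimal Python sketch of the iterative adjustment loop the abstract describes: estimate a finite-difference FUR per layer, then grow the most FLOPs-efficient layer and shrink the least efficient one, with the step size bounded by the adjusting rate. All names here (estimate_accuracy, flops_of, the channel-dict representation) are hypothetical stand-ins, not the authors' actual API.

```python
# Sketch of network adjustment guided by FUR. The accuracy and FLOPs
# oracles are left abstract; in practice accuracy would be estimated
# by a short fine-tune/evaluation of the reconfigured network.

def estimate_fur(channels, layer, delta, estimate_accuracy, flops_of):
    """Finite-difference estimate of the FLOPs utilization ratio for one
    layer: accuracy change per unit FLOPs change when the layer's channel
    count is perturbed by `delta`."""
    base_acc = estimate_accuracy(channels)
    base_flops = flops_of(channels)
    perturbed = dict(channels)
    perturbed[layer] += delta
    d_acc = estimate_accuracy(perturbed) - base_acc
    d_flops = flops_of(perturbed) - base_flops
    return d_acc / max(d_flops, 1e-12)


def adjust_network(channels, steps, adjust_rate, estimate_accuracy, flops_of):
    """Iteratively move channels from low-FUR layers to high-FUR layers.
    `adjust_rate` bounds the relative change per step so the first-order
    FUR estimate remains valid in a small neighborhood of the network."""
    for _ in range(steps):
        furs = {
            layer: estimate_fur(channels, layer,
                                max(1, int(adjust_rate * count)),
                                estimate_accuracy, flops_of)
            for layer, count in channels.items()
        }
        # Grow the most FLOPs-efficient layer, shrink the least efficient
        # one, keeping the total FLOPs budget roughly constant.
        best = max(furs, key=furs.get)
        worst = min(furs, key=furs.get)
        channels[best] += max(1, int(adjust_rate * channels[best]))
        channels[worst] = max(1, channels[worst]
                              - max(1, int(adjust_rate * channels[worst])))
    return channels
```

In this sketch a network configuration is just a mapping from layer names to channel counts; the small per-step perturbation mirrors the paper's point that FUR, like a gradient, is only trustworthy close to the current configuration.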


