Exploiting Operation Importance for Differentiable Neural Architecture Search

11/24/2019
by Xukai Xie et al.

Recently, differentiable neural architecture search methods have significantly reduced the search cost by constructing a super network and relaxing the architecture representation, assigning architecture weights to the candidate operations. All existing methods determine the importance of each operation directly from its architecture weight. However, architecture weights cannot accurately reflect the importance of each operation; that is, the operation with the highest weight might not be the one that yields the best performance. To alleviate this deficiency, we propose a simple yet effective solution to neural architecture search, termed exploiting operation importance for effective neural architecture search (EoiNAS), in which a new indicator is proposed to fully exploit the operation importance and guide the model search. Based on this indicator, we propose a gradual operation pruning strategy to further improve search efficiency and accuracy. Experimental results demonstrate the effectiveness of the proposed method. Specifically, we achieve an error rate of 2.50% on CIFAR-10, which significantly outperforms state-of-the-art methods. When transferred to ImageNet, the searched architecture achieves a top-1 error of 25.6%, comparable to state-of-the-art performance under the mobile setting.
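The gradual operation pruning strategy described above can be sketched as follows. This is a minimal illustrative sketch, not the EoiNAS algorithm itself: the importance scores, operation names, and pruning schedule below are hypothetical placeholders, whereas the paper derives importance from its proposed indicator rather than from raw architecture weights.

```python
# Hypothetical sketch of gradual operation pruning on one edge of a
# DARTS-style supernet. Scores and schedule are illustrative only.

def prune_gradually(candidates, importance, keep_fractions):
    """Iteratively drop the least-important candidate operations.

    candidates:     list of candidate operation names on one edge
    importance:     dict mapping operation name -> importance score
                    (a placeholder for the paper's indicator)
    keep_fractions: fractions of the original candidates to keep after
                    each pruning round, e.g. [0.75, 0.5, 0.125]
    """
    remaining = list(candidates)
    for frac in keep_fractions:
        k = max(1, round(len(candidates) * frac))
        # Keep the k operations with the highest importance scores.
        remaining = sorted(remaining, key=lambda op: importance[op],
                           reverse=True)[:k]
        # In a real search, the supernet would be trained further here
        # and importance re-estimated before the next pruning round.
    return remaining

# Typical DARTS search-space operations, with made-up scores.
ops = ["skip", "sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3",
       "avg_pool_3x3", "dil_conv_3x3", "dil_conv_5x5", "zero"]
scores = {"skip": 0.4, "sep_conv_3x3": 0.9, "sep_conv_5x5": 0.7,
          "max_pool_3x3": 0.2, "avg_pool_3x3": 0.1,
          "dil_conv_3x3": 0.6, "dil_conv_5x5": 0.5, "zero": 0.05}
print(prune_gradually(ops, scores, [0.75, 0.5, 0.125]))
```

Pruning in several rounds rather than all at once lets the remaining operations adapt before the next selection, which is the intuition behind the paper's claim of improved efficiency and accuracy.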


