Prioritized Architecture Sampling with Monte-Carlo Tree Search

03/22/2021
by Xiu Su et al.

One-shot neural architecture search (NAS) methods significantly reduce the search cost by treating the whole search space as a single network, which needs to be trained only once. However, current methods select each operation independently, without considering preceding layers. Moreover, the historical information obtained at great computational cost is typically used once and then discarded. In this paper, we introduce a sampling strategy based on Monte Carlo tree search (MCTS), modeling the search space as a Monte Carlo tree (MCT) that captures the dependencies among layers. Furthermore, intermediate results are stored in the MCT for future decisions and a better exploration-exploitation balance. Concretely, the MCT is updated using the training loss as a reward for architecture performance. To evaluate the numerous nodes accurately, we propose node communication and hierarchical node selection methods for the training and search stages, respectively, which make better use of the operation rewards and hierarchical information. Moreover, for a fair comparison of different NAS methods, we construct an open-source NAS benchmark on a macro search space evaluated on CIFAR-10, namely NAS-Bench-Macro. Extensive experiments on NAS-Bench-Macro and ImageNet demonstrate that our method significantly improves search efficiency and performance. For example, by searching only 20 architectures, our method obtains an architecture that achieves 78.0% top-1 accuracy with 442M FLOPs on ImageNet. Code (and the benchmark) is available at: <https://github.com/xiusu/NAS-Bench-Macro>.
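To make the tree-based sampling concrete, the following is a minimal illustrative sketch (not the authors' implementation) of an MCT node where each child corresponds to a candidate operation for the next layer, rewards are backed up from the training signal, and a child is chosen by the standard UCT rule that balances exploitation against exploration. The class and constant names (`MCTNode`, `c`) are hypothetical.

```python
import math

class MCTNode:
    """One node of a Monte Carlo tree; each child represents a
    candidate operation for the next layer of the architecture."""

    def __init__(self, operation=None):
        self.operation = operation
        self.children = []
        self.visits = 0
        self.total_reward = 0.0

    def update(self, reward):
        # Back up a reward (e.g. one derived from the training loss)
        # into this node's running statistics.
        self.visits += 1
        self.total_reward += reward

    def uct_score(self, parent_visits, c=1.4):
        # Unvisited children are explored first.
        if self.visits == 0:
            return float("inf")
        exploit = self.total_reward / self.visits
        explore = c * math.sqrt(math.log(parent_visits) / self.visits)
        return exploit + explore

    def select_child(self):
        # Pick the child operation with the best
        # exploration-exploitation trade-off.
        return max(self.children, key=lambda ch: ch.uct_score(self.visits))
```

Sampling an architecture then amounts to walking the tree root-to-leaf with `select_child` at each layer; because the statistics persist across samples, earlier evaluations keep informing later decisions instead of being discarded.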

