Few-shot Neural Architecture Search

06/11/2020
by Yiyang Zhao, et al.

To improve the search efficiency of Neural Architecture Search (NAS), one-shot NAS trains a single super-net to approximate the performance of candidate architectures during search via weight sharing. While this greatly reduces computation cost, the approximation error makes the performance predicted by a single super-net far less accurate than training each candidate architecture from scratch, leading to search inefficiency. In this work, we propose few-shot NAS, which explores the use of multiple super-nets: each super-net is pre-trained to be in charge of a sub-region of the search space, which reduces its prediction error. Moreover, these super-nets can be trained jointly via sequential fine-tuning. A natural choice of sub-region is to follow the splitting of the search space in NAS. We empirically evaluate our approach on three different tasks in NAS-Bench-201. Extensive results demonstrate that few-shot NAS, using only 5 super-nets, significantly improves the performance of many search methods with a slight increase in search time. The architectures found by DARTS and ENAS with few-shot models achieved 88.53% on CIFAR-10 in NAS-Bench-201, significantly outperforming their one-shot counterparts (54.30% with one-shot DARTS). Few-shot NAS also outperforms previous state-of-the-art models.
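The core mechanism described above (partitioning the search space into sub-regions and assigning one super-net per sub-region) can be sketched in a few lines. Everything here is illustrative: the toy cell space, the candidate operation names, and the choice to split on the first edge are assumptions for demonstration, not the paper's exact setup.

```python
from itertools import product

# Toy cell search space: each of 3 edges picks one candidate operation.
OPS = ["conv3x3", "conv1x1", "skip", "zero"]
NUM_EDGES = 3

def all_architectures():
    """Enumerate every architecture (one op per edge): 4^3 = 64 here."""
    return list(product(OPS, repeat=NUM_EDGES))

def split_search_space():
    """Few-shot NAS idea: partition the space by fixing the op on edge 0.

    Each sub-region gets its own super-net, so weights are shared only
    among architectures that agree on edge 0, which shrinks the
    approximation error of each super-net's performance prediction.
    """
    return {op: [a for a in all_architectures() if a[0] == op] for op in OPS}

def supernet_for(arch):
    """Route an architecture to the super-net in charge of its sub-region."""
    return arch[0]  # the fixed choice on edge 0 identifies the super-net

regions = split_search_space()
# The sub-regions cover the whole space without overlap.
assert sum(len(v) for v in regions.values()) == len(all_architectures())
```

With 4 candidate ops this yields 4 sub-super-nets (the paper reports strong results with as few as 5); in practice each sub-super-net would be warm-started from the one-shot super-net's weights (the "sequential fine-tuning" mentioned in the abstract) rather than trained from scratch.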


Related research

11/13/2021 · Towards One Shot Search Space Poisoning in Neural Architecture Search
We evaluate the robustness of a Neural Architecture Search (NAS) algorit...

03/29/2022 · Generalizing Few-Shot NAS with Gradient Matching
Efficient performance estimation of architectures drawn from large searc...

05/19/2021 · Efficient Transfer Learning via Joint Adaptation of Network Architecture and Weight
Transfer learning can boost the performance on the target task by leverag...

12/20/2021 · Enabling NAS with Automated Super-Network Generation
Recent Neural Architecture Search (NAS) solutions have produced impressi...

06/23/2019 · One-Shot Neural Architecture Search Through A Posteriori Distribution Guided Sampling
The emergence of one-shot approaches has greatly advanced the research o...

04/23/2021 · Inter-choice dependent super-network weights
The automatic design of architectures for neural networks, Neural Archit...

05/25/2023 · Towards Automatic Neural Architecture Search within General Super-Networks
Existing neural architecture search (NAS) methods typically rely on pre-...
