Efficient Novelty-Driven Neural Architecture Search

07/22/2019
by   Miao Zhang, et al.

One-shot neural architecture search (NAS) has recently attracted broad attention because weight sharing greatly reduces the computational cost of search. However, extensive experiments in several recent works show that, for one-shot NAS, there is no positive correlation between the validation accuracy obtained with weights inherited from the supernet and the test accuracy after re-training. Rather than devising a controller to find the best-performing architecture with inherited weights, this paper focuses on how to sample architectures for supernet training so that the supernet becomes more predictive. A single-path supernet is adopted, in which only a small subset of the weights is optimized at each step, greatly reducing the memory demand. Furthermore, instead of designing a complicated reward-based controller for architecture sampling, we sample the architectures used to train the supernet through novelty search. We devise an efficient novelty search method for NAS, and extensive experiments demonstrate the effectiveness and efficiency of this novelty-search-based architecture sampling. With the same search space, the best architecture found by our algorithm achieves a state-of-the-art test error rate of 2.51% on CIFAR-10 with only 7.5 hours of search time on a single GPU, and a validation perplexity of 60.02 and a test perplexity of 57.36 on PTB. We also transfer the searched cell structures to the larger datasets ImageNet and WikiText-2, respectively.
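
The sampling loop described in the abstract can be pictured concretely: keep an archive of the single-path architectures already used to train the supernet, score each new candidate by its distance to its nearest neighbours in the archive, and train the supernet on the most novel path at every step. The sketch below is only an illustration of that idea under assumed encodings and hyper-parameters (NUM_EDGES, NUM_OPS, K_NEAREST, sample_for_supernet_step and train_supernet_one_step are hypothetical names), not the authors' implementation.

```python
import random
import numpy as np

# Minimal sketch of novelty-driven architecture sampling for a single-path
# supernet. All constants below are illustrative assumptions, not the
# paper's actual hyper-parameters.
NUM_EDGES = 14   # decision positions in one cell (assumed)
NUM_OPS = 8      # candidate operations per edge (assumed)
K_NEAREST = 10   # neighbourhood size for the novelty score (assumed)

def random_architecture():
    """Encode a single-path architecture as one operation index per edge."""
    return np.array([random.randrange(NUM_OPS) for _ in range(NUM_EDGES)])

def novelty_score(arch, archive, k=K_NEAREST):
    """Average Hamming distance to the k nearest architectures in the archive."""
    if not archive:
        return float("inf")
    dists = sorted(int(np.sum(arch != past)) for past in archive)
    return float(np.mean(dists[:k]))

def sample_for_supernet_step(archive, num_candidates=32):
    """Propose random candidates and return the most novel one for this step."""
    candidates = [random_architecture() for _ in range(num_candidates)]
    best = max(candidates, key=lambda a: novelty_score(a, archive))
    archive.append(best)
    return best

# Usage: at each supernet training step, activate only the sampled path.
archive = []
for step in range(5):
    arch = sample_for_supernet_step(archive)
    # train_supernet_one_step(arch, batch)  # placeholder for the weight update
    print(step, arch)
```

Because the supernet is single-path, the weight update at each step touches only the operations along the sampled path, which is what keeps the memory demand low.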

Related research

03/03/2020
ADWPNAS: Architecture-Driven Weight Prediction for Neural Architecture Search
How to discover and evaluate the true strength of models quickly and acc...

12/23/2020
Evolving Neural Architecture Using One Shot Model
Neural Architecture Search (NAS) is emerging as a new research direction...

11/05/2018
You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization
Recently Neural Architecture Search (NAS) has aroused great interest in ...

03/31/2019
Single Path One-Shot Neural Architecture Search with Uniform Sampling
One-shot method is a powerful Neural Architecture Search (NAS) framework...

09/27/2022
Towards Regression-Free Neural Networks for Diverse Compute Platforms
With the shift towards on-device deep learning, ensuring a consistent be...

03/31/2019
Understanding Neural Architecture Search Techniques
Automatic methods for generating state-of-the-art neural network archite...

11/23/2020
ROME: Robustifying Memory-Efficient NAS via Topology Disentanglement and Gradients Accumulation
Single-path based differentiable neural architecture search has great st...
