Evaluating the Search Phase of Neural Architecture Search

02/21/2019
by Christian Sciuto, et al.

Neural Architecture Search (NAS) aims to facilitate the design of deep networks for new tasks. Existing techniques rely on two stages: searching over the architecture space and validating the best architecture. Evaluating NAS algorithms is currently solely done by comparing their results on the downstream task. While intuitive, this fails to explicitly evaluate the effectiveness of their search strategies. In this paper, we extend the NAS evaluation procedure to include the search phase. To this end, we compare the quality of the solutions obtained by NAS search policies with that of random architecture selection. We find that: (i) On average, the random policy outperforms state-of-the-art NAS algorithms; and (ii) The results and candidate rankings of NAS algorithms do not reflect the true performance of the candidate architectures. While our former finding illustrates the fact that the NAS search space has been sufficiently constrained so that random solutions yield good results, we trace the latter back to the weight sharing strategy used by state-of-the-art NAS methods. In contrast with common belief, weight sharing negatively impacts the training of good architectures, thus reducing the effectiveness of the search process. We believe that following our evaluation framework will be key to designing NAS strategies that truly discover superior architectures.
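As a rough illustration of the random baseline the paper compares against, the sketch below uniformly samples architectures from a hypothetical cell-based search space and keeps the one with the highest validation score. The operation list, edge count, and the `evaluate_architecture` stub are assumptions made for illustration only; in the paper's setting, each sampled architecture would be trained from scratch (without weight sharing) and compared against the NAS-selected architecture under the same search budget.

```python
import random

# Hypothetical cell-based search space (assumption, in the style of
# DARTS/ENAS-like spaces): an architecture is a list of operation choices.
OPERATIONS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3",
              "avg_pool_3x3", "skip_connect", "zero"]
NUM_EDGES = 14  # number of decisions per architecture (assumption)


def sample_random_architecture(rng):
    """Uniformly sample one architecture from the search space."""
    return [rng.choice(OPERATIONS) for _ in range(NUM_EDGES)]


def evaluate_architecture(arch):
    """Placeholder evaluation: in practice, train the architecture from
    scratch and return its validation accuracy. A random score stands in
    here so the sketch runs end to end."""
    return random.random()


def random_search_baseline(num_samples=8, seed=0):
    """Random search policy: sample architectures, evaluate each one
    independently (no weight sharing), and keep the best."""
    rng = random.Random(seed)
    best_arch, best_acc = None, float("-inf")
    for _ in range(num_samples):
        arch = sample_random_architecture(rng)
        acc = evaluate_architecture(arch)
        if acc > best_acc:
            best_arch, best_acc = arch, acc
    return best_arch, best_acc


if __name__ == "__main__":
    arch, acc = random_search_baseline()
    print("Best randomly sampled architecture:", arch)
    print("Validation score:", acc)
```

For the comparison to be meaningful, the random policy and the NAS search policy must be given the same evaluation budget and the same final training protocol; the paper's finding is that, on average, an architecture chosen this way performs at least as well as the one returned by state-of-the-art NAS methods.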

