Efficient Sampling for Predictor-Based Neural Architecture Search

11/24/2020
by   Lukas Mauch, et al.

Recently, predictor-based algorithms have emerged as a promising approach to neural architecture search (NAS). NAS typically requires computing the validation accuracy of a large number of deep neural networks (DNNs), which is computationally expensive. Predictor-based NAS algorithms address this problem by training a proxy model that infers the validation accuracy of a DNN directly from its network structure. During optimization, the proxy is used to narrow down the set of architectures for which the true validation accuracy must be computed, which makes predictor-based algorithms sample efficient. Usually, the proxy is evaluated for all DNNs in the search space, and those that maximize the proxy are picked as candidates for optimization. In practice, however, this is intractable, because search spaces are often very large and can contain billions of network architectures. The contributions of this paper are threefold: 1) We define a sample efficiency gain to compare different predictor-based NAS algorithms. 2) We conduct experiments on the NASBench-101 dataset and show that the sample efficiency of predictor-based algorithms decreases dramatically if the proxy is evaluated only on a subset of the search space. 3) We show that if the subset of the search space on which the proxy is evaluated is chosen in a smart way, the sample efficiency of the original predictor-based algorithm, which has access to the full search space, can be regained. This is an important step towards making predictor-based NAS algorithms useful in practice.
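To make the selection step concrete, the following is a minimal sketch of a generic predictor-based NAS loop of the kind described above. It is not the authors' implementation: the architecture encoding, the least-squares proxy, the synthetic accuracy function, and all sizes (number of architectures, seed set, candidate subset, top-k budget) are illustrative assumptions, and the candidate subset is drawn uniformly at random rather than with the smart subset selection the paper proposes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each architecture is encoded as a fixed-length feature
# vector, and its "true" validation accuracy is an expensive black-box query.
NUM_ARCHS, NUM_FEATURES = 10_000, 16
search_space = rng.random((NUM_ARCHS, NUM_FEATURES))
true_weights = rng.normal(size=NUM_FEATURES)

def true_validation_accuracy(x):
    """Stand-in for training a DNN and measuring its validation accuracy."""
    return float(x @ true_weights + 0.05 * rng.normal())

def fit_proxy(features, accuracies):
    """Fit a simple least-squares proxy: predicted accuracy = features @ w."""
    w, *_ = np.linalg.lstsq(features, np.asarray(accuracies), rcond=None)
    return w

# 1) Evaluate a small random seed set of architectures (the expensive step).
seed_idx = rng.choice(NUM_ARCHS, size=32, replace=False)
evaluated = {int(i): true_validation_accuracy(search_space[i]) for i in seed_idx}

# 2) Iteratively refit the proxy and spend the evaluation budget only on the
#    candidates the proxy ranks highest.
for _ in range(5):
    idx = np.fromiter(evaluated, dtype=int)
    w = fit_proxy(search_space[idx], [evaluated[i] for i in idx])

    # Score a candidate subset with the cheap proxy instead of the full space
    # (here uniformly random; the paper studies smarter subset choices).
    candidate_idx = rng.choice(NUM_ARCHS, size=1_000, replace=False)
    scores = search_space[candidate_idx] @ w

    # Evaluate the top-ranked, not-yet-evaluated candidates exactly.
    for i in candidate_idx[np.argsort(scores)[::-1]][:8]:
        if int(i) not in evaluated:
            evaluated[int(i)] = true_validation_accuracy(search_space[i])

best = max(evaluated, key=evaluated.get)
print(f"best architecture index: {best}, accuracy: {evaluated[best]:.3f}")
```

In this sketch the proxy is refit after every batch of true evaluations, so each round of candidate scoring benefits from all accuracies measured so far; the design choice being illustrated is that the expensive evaluations are concentrated on architectures the proxy already considers promising.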


Related research

ReNAS: Relativistic Evaluation of Neural Architecture Search (09/30/2019)
An effective and efficient architecture performance evaluation scheme is...

NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search (03/28/2020)
Neural architecture search (NAS) is a promising method for automatically...

RNAS: Architecture Ranking for Powerful Networks (09/30/2019)
Neural Architecture Search (NAS) is attractive for automatically produci...

A Survey on Surrogate-assisted Efficient Neural Architecture Search (06/03/2022)
Neural architecture search (NAS) has become increasingly popular in the ...

GNAS: A Generalized Neural Network Architecture Search Framework (03/19/2021)
In practice, the problems encountered in training NAS (Neural Architectu...

RANK-NOSH: Efficient Predictor-Based Architecture Search via Non-Uniform Successive Halving (08/18/2021)
Predictor-based algorithms have achieved remarkable performance in the N...

Predict NAS Multi-Task by Stacking Ensemble Models using GP-NAS (05/02/2023)
Accurately predicting the performance of architecture with small sample ...
