Neural Architecture Ranker

01/30/2022
by Guo Bicheng, Brandon, et al.

Architecture ranking has recently been advocated to design an efficient and effective performance predictor for Neural Architecture Search (NAS). The previous contrastive method solves the ranking problem by comparing pairs of architectures and predicting their relative performance, which may suffer from generalization issues due to the local pair-wise comparison. Inspired by the quality stratification phenomenon in the search space, we propose a predictor, namely Neural Architecture Ranker (NAR), from a new and global perspective by exploiting the quality distribution of the whole search space. The NAR learns the shared characteristics of each quality tier (i.e., level) and distinguishes among different individuals by first matching architectures with the representation of tiers, and then classifying and scoring them. It can capture the features of different quality tiers and thus generalize its ranking ability to the entire search space. Besides, the distributions of different quality tiers are also beneficial to guide the sampling procedure, which is free of training a search algorithm and thus simplifies the NAS pipeline. The proposed NAR achieves better performance than the state-of-the-art methods on two widely accepted datasets. On NAS-Bench-101, it finds architectures with top 0.01‰ performance in the search space and stably focuses on the top architectures. On NAS-Bench-201, it identifies the optimal architectures on CIFAR-10, CIFAR-100, and ImageNet-16-120. We expand and release these two datasets, covering detailed cell computational information, to boost the study of NAS.
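Below is a minimal sketch of the tier-based ranking idea described in the abstract: encode an architecture, match it against learnable representations of quality tiers, then classify its tier and score it. This is not the authors' implementation; the module names, the MLP encoder, the attention-style matching, the feature dimensions, and the number of tiers are all assumptions made for illustration.

```python
# Illustrative sketch only -- not the official NAR code. Encoder design,
# dimensions, and the number of tiers (num_tiers) are assumed values.
import torch
import torch.nn as nn


class TierRankerSketch(nn.Module):
    def __init__(self, arch_feat_dim=64, embed_dim=128, num_tiers=5):
        super().__init__()
        # Encode a flattened architecture descriptor (e.g., a cell encoding).
        self.encoder = nn.Sequential(
            nn.Linear(arch_feat_dim, embed_dim), nn.ReLU(),
            nn.Linear(embed_dim, embed_dim),
        )
        # Learnable representation of each quality tier in the search space.
        self.tier_embeddings = nn.Parameter(torch.randn(num_tiers, embed_dim))
        # Heads: which tier an architecture belongs to, and a scalar quality score.
        self.tier_classifier = nn.Linear(embed_dim, num_tiers)
        self.scorer = nn.Linear(embed_dim, 1)

    def forward(self, arch_features):
        h = self.encoder(arch_features)                              # (B, embed_dim)
        # Match each architecture against the tier representations (soft attention).
        attn = torch.softmax(h @ self.tier_embeddings.t(), dim=-1)   # (B, num_tiers)
        matched = attn @ self.tier_embeddings                        # (B, embed_dim)
        fused = h + matched
        tier_logits = self.tier_classifier(fused)   # classify into quality tiers
        score = self.scorer(fused).squeeze(-1)      # rank candidates by this score
        return tier_logits, score


if __name__ == "__main__":
    # Usage: rank a batch of candidate architectures by predicted score.
    model = TierRankerSketch()
    feats = torch.randn(8, 64)   # 8 candidate architecture encodings (dummy data)
    logits, scores = model(feats)
    print(scores.argsort(descending=True))
```

In this sketch, the predicted tier distribution could also be used to bias which regions of the search space are sampled next, in the spirit of the sampling guidance mentioned in the abstract.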


Related research

ReNAS: Relativistic Evaluation of Neural Architecture Search (09/30/2019)
An effective and efficient architecture performance evaluation scheme is...

Automatic Subspace Evoking for Efficient Neural Architecture Search (10/31/2022)
Neural Architecture Search (NAS) aims to automatically find effective ar...

Weak NAS Predictors Are All You Need (02/21/2021)
Neural Architecture Search (NAS) finds the best network architecture by ...

GENNAPE: Towards Generalized Neural Architecture Performance Estimators (11/30/2022)
Predicting neural architecture performance is a challenging task and is ...

Searching for Stage-wise Neural Graphs In the Limit (12/30/2019)
Search space is a key consideration for neural architecture search. Rece...

Evolving Neural Networks with Optimal Balance between Information Flow and Connections Cost (02/12/2022)
Evolving Neural Networks (NNs) has recently seen an increasing interest ...

RATs-NAS: Redirection of Adjacent Trails on GCN for Neural Architecture Search (05/07/2023)
Various hand-designed CNN architectures have been developed, such as VGG...
