RNAS: Architecture Ranking for Powerful Networks

09/30/2019
by Yixing Xu, et al.

Neural Architecture Search (NAS) is attractive for automatically producing deep networks with excellent performance at acceptable computational cost. In most existing NAS algorithms, the performance of an intermediate network is represented by results evaluated on a small proxy dataset with insufficient training, in order to save computational resources and time. Although these representations can help distinguish some searched architectures, they are still far from the exact performance or ranking order of all networks sampled from the given search space. We therefore propose to learn a performance predictor that ranks different models during the search, using only a few networks pre-trained on the entire dataset. We represent each neural architecture as a feature tensor and use the predictor to further refine the representations of networks in the search space. The resulting performance predictor can then be used to search for desired architectures without additional evaluation. Experimental results show that, using only 0.1% (424 models) of the entire NASBench dataset, we can construct an accurate predictor that efficiently finds an architecture with 93.90% accuracy (within the top 0.04% of the whole search space), about 0.5% higher than that of state-of-the-art methods.
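The idea of encoding an architecture as a feature tensor and ranking candidates with a learned predictor can be sketched as follows. This is a minimal illustration, not the paper's actual model: the op vocabulary, the flat adjacency-plus-one-hot encoding, and the least-squares regressor standing in for the learned predictor are all simplifying assumptions.

```python
# Hedged sketch: encode a NAS-Bench-style cell (adjacency matrix plus
# per-node operation labels) as a flat feature vector, fit a least-squares
# accuracy predictor on a few fully trained architectures, then rank
# unseen architectures by predicted accuracy without training them.
import numpy as np

OPS = ["conv1x1", "conv3x3", "maxpool3x3"]  # illustrative op vocabulary


def encode(adj, ops):
    """Flatten the adjacency matrix and one-hot op labels into one vector."""
    one_hot = np.zeros((len(ops), len(OPS)))
    for i, op in enumerate(ops):
        one_hot[i, OPS.index(op)] = 1.0
    return np.concatenate([np.asarray(adj, dtype=float).ravel(),
                           one_hot.ravel()])


def fit_predictor(archs, accs):
    """Fit features -> accuracy by least squares (stand-in predictor)."""
    X = np.stack([encode(adj, ops) for adj, ops in archs])
    X = np.hstack([X, np.ones((len(X), 1))])  # bias column
    w, *_ = np.linalg.lstsq(X, np.asarray(accs, dtype=float), rcond=None)
    return w


def rank(archs, w):
    """Return candidate indices sorted by predicted accuracy, best first."""
    X = np.stack([encode(adj, ops) for adj, ops in archs])
    X = np.hstack([X, np.ones((len(X), 1))])
    scores = X @ w
    return sorted(range(len(archs)), key=lambda i: -scores[i])


# Toy usage: two 2-node cells with known accuracies, then rank them.
adj = [[0, 1], [0, 0]]
trained = [(adj, ["conv3x3", "conv3x3"]), (adj, ["maxpool3x3", "maxpool3x3"])]
w = fit_predictor(trained, [0.90, 0.80])
order = rank(trained, w)
```

In the paper's setting the predictor is trained on a small fraction of fully evaluated models and then scores the remaining search space for free; any sufficiently expressive regressor could replace the least-squares fit here.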


Related research

- 09/30/2019  ReNAS: Relativistic Evaluation of Neural Architecture Search
  An effective and efficient architecture performance evaluation scheme is...

- 05/29/2020  DC-NAS: Divide-and-Conquer Neural Architecture Search
  Most applications demand high-performance deep neural architectures cost...

- 05/14/2020  A Semi-Supervised Assessor of Neural Architectures
  Neural architecture search (NAS) aims to automatically design deep neura...

- 11/24/2020  Efficient Sampling for Predictor-Based Neural Architecture Search
  Recently, predictor-based algorithms emerged as a promising approach for...

- 10/02/2022  Siamese-NAS: Using Trained Samples Efficiently to Find Lightweight Neural Architecture by Prior Knowledge
  In the past decade, many architectures of convolution neural networks we...

- 04/17/2020  Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks
  Neural architecture search has attracted wide attentions in both academi...

- 06/06/2023  ColdNAS: Search to Modulate for User Cold-Start Recommendation
  Making personalized recommendation for cold-start users, who only have a...
