RankNAS: Efficient Neural Architecture Search by Pairwise Ranking

09/15/2021
by Chi Hu, et al.

This paper addresses the efficiency challenge of Neural Architecture Search (NAS) by formulating the task as a ranking problem. Previous methods require numerous training examples to accurately estimate the performance of architectures, even though the actual goal is only to distinguish "good" candidates from "bad" ones. Here we do not resort to performance predictors. Instead, we propose a performance ranking method (RankNAS) based on pairwise ranking, which enables efficient architecture search with far fewer training examples. Moreover, we develop an architecture selection method that prunes the search space and concentrates on the more promising candidates. Extensive experiments on machine translation and language modeling tasks show that RankNAS can design high-performance architectures while being orders of magnitude faster than state-of-the-art NAS systems.

Related research

06/24/2018 · DARTS: Differentiable Architecture Search
This paper addresses the scalability challenge of architecture search by...

04/28/2020 · Angle-based Search Space Shrinking for Neural Architecture Search
In this work, we present a simple and general search space shrinking met...

11/26/2019 · Ranking architectures using meta-learning
Neural architecture search has recently attracted lots of research effor...

12/10/2019 · Efficient Differentiable Neural Architecture Search with Meta Kernels
The searching procedure of neural architecture search (NAS) is notorious...

05/06/2020 · Learning Architectures from an Extended Search Space for Language Modeling
Neural architecture search (NAS) has advanced significantly in recent ye...

02/27/2021 · Neural Architecture Search From Task Similarity Measure
In this paper, we propose a neural architecture search framework based o...

08/06/2021 · AceNAS: Learning to Rank Ace Neural Architectures with Weak Supervision of Weight Sharing
Architecture performance predictors have been widely used in neural arch...
