FairNAS: Rethinking Evaluation Fairness of Weight Sharing Neural Architecture Search

07/03/2019
by   Xiangxiang Chu, et al.

The ability to rank models by their real strength is key to Neural Architecture Search. Traditional approaches adopt incomplete training for this purpose, which is still very costly. One-shot methods were devised to cut this expense by reusing a single set of shared weights. However, it is uncertain whether shared weights are truly effective, and it is unclear whether a picked model is better because of its strong representational power or simply because it was overtrained. To remove this suspicion, we propose a novel idea called Fair Neural Architecture Search (FairNAS), in which a strict fairness constraint is enforced for fair inheritance and training. In this way, our supernet exhibits good convergence and very high training accuracy. The performance of any sampled model loaded with shared weights from the supernet strongly correlates with that of its stand-alone counterpart when trained fully, which dramatically improves searching efficiency. With a multi-objective reinforced evolutionary search backend, our pipeline generates a new set of state-of-the-art architectures on ImageNet: FairNAS-A attains 75.34% top-1 validation accuracy and FairNAS-C 74.69%, competitive with other models of comparable cost. The models and their evaluation code are publicly available at http://github.com/fairnas/FairNAS.
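The strict fairness constraint can be made concrete: at every supernet training step, each layer's m candidate operations are each activated exactly once, on the same mini-batch, by drawing one uniform random permutation of the choices per layer and accumulating the gradients of the resulting m single-path models before a single parameter update. Below is a minimal PyTorch sketch of that sampling-and-training loop; the names (FairSupernet, fair_train_step, make_op) and the overall structure are illustrative assumptions, not the authors' released code.

```python
import random

import torch.nn as nn


class FairSupernet(nn.Module):
    """Single-path supernet: each layer holds m candidate operations."""

    def __init__(self, num_layers: int, num_choices: int, make_op):
        super().__init__()
        # choices[l][j] is the j-th candidate op at layer l
        # (make_op is a hypothetical op factory supplied by the caller).
        self.choices = nn.ModuleList(
            nn.ModuleList(make_op() for _ in range(num_choices))
            for _ in range(num_layers)
        )

    def forward(self, x, path):
        # path[l] picks which candidate op runs at layer l.
        for layer_ops, j in zip(self.choices, path):
            x = layer_ops[j](x)
        return x


def fair_train_step(supernet, images, targets, criterion, optimizer, m):
    """One strict-fairness step: every one of the m choices at every layer
    is trained exactly once on the same mini-batch; gradients of the m
    single-path models are accumulated, then applied in a single update."""
    optimizer.zero_grad()
    # One independent uniform permutation of the m choices per layer.
    perms = [random.sample(range(m), m) for _ in supernet.choices]
    for k in range(m):
        path = [perm[k] for perm in perms]   # k-th single-path model
        loss = criterion(supernet(images, path), targets)
        loss.backward()                      # accumulate gradients
    optimizer.step()                         # one update per step
```

Under this scheme every candidate block receives the same number of gradient updates from the same data, which is what allows supernet accuracies to rank sampled models reliably, as the abstract claims.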

Related research:

07/05/2023  Dynamical Isometry based Rigorous Fair Neural Architecture Search
Recently, the weight-sharing technique has significantly speeded up the ...

08/16/2019  ScarletNAS: Bridging the Gap Between Scalability and Fairness in Neural Architecture Search
One-shot neural architecture search features fast training of a supernet...

12/24/2019  BETANAS: BalancEd TrAining and selective drop for Neural Architecture Search
Automatic neural architecture search techniques are becoming increasingl...

11/27/2020  Multi-objective Neural Architecture Search with Almost No Training
In the recent past, neural architecture search (NAS) has attracted incre...

02/21/2019  Overcoming Multi-Model Forgetting
We identify a phenomenon, which we refer to as multi-model forgetting, t...

03/24/2020  BigNAS: Scaling Up Neural Architecture Search with Big Single-Stage Models
Neural architecture search (NAS) has shown promising results discovering...

10/29/2020  Cream of the Crop: Distilling Prioritized Paths For One-Shot Neural Architecture Search
One-shot weight sharing methods have recently drawn great attention in n...
