Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap

08/04/2020
by Lingxi Xie, et al.

Neural architecture search (NAS) has attracted increasing attention in both academia and industry. Early on, researchers mostly applied individual search methods, which sample and evaluate candidate architectures separately and thus incur heavy computational overheads. To alleviate this burden, weight-sharing methods were proposed, in which exponentially many architectures share weights in the same super-network and the costly training procedure is performed only once. These methods, though much faster, often suffer from instability. This paper provides a literature review of NAS, in particular the weight-sharing methods, and points out that the major challenge comes from the optimization gap between the super-network and its sub-architectures. From this perspective, we summarize existing approaches into several categories according to their efforts in bridging the gap, and analyze both the advantages and disadvantages of these methodologies. Finally, we share our opinions on the future directions of NAS and AutoML. Owing to the expertise of the authors, this paper mainly focuses on the application of NAS to computer vision problems and may be biased towards the work in our group.
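
To make the weight-sharing idea concrete, the sketch below trains a toy super-network in the single-path style: one sub-architecture is sampled uniformly per step, and only the sampled path updates the shared weights. This is a minimal illustration assuming PyTorch; the names (SuperNet, CANDIDATE_OPS, sample_arch) and the toy search space are illustrative, not taken from the paper.

```python
# Minimal weight-sharing NAS sketch (single-path sampling), assuming PyTorch.
# All names and the search space are hypothetical, for illustration only.
import random
import torch
import torch.nn as nn

# Candidate operations for each searchable layer (a toy search space).
CANDIDATE_OPS = [
    lambda c: nn.Conv2d(c, c, 3, padding=1),
    lambda c: nn.Conv2d(c, c, 5, padding=2),
    lambda c: nn.Identity(),
]

class SuperNet(nn.Module):
    """Holds the shared weights of every candidate op in every layer."""
    def __init__(self, channels=16, num_layers=4):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.layers = nn.ModuleList(
            nn.ModuleList(op(channels) for op in CANDIDATE_OPS)
            for _ in range(num_layers)
        )
        self.head = nn.Linear(channels, 10)

    def forward(self, x, arch):
        # `arch` selects one candidate op per layer; every selection
        # reuses the super-network's weights, so no sub-architecture
        # is trained from scratch.
        x = self.stem(x)
        for layer, op_idx in zip(self.layers, arch):
            x = layer[op_idx](x)
        return self.head(x.mean(dim=(2, 3)))

def sample_arch(num_layers=4):
    """Uniformly sample one sub-architecture (one op index per layer)."""
    return [random.randrange(len(CANDIDATE_OPS)) for _ in range(num_layers)]

# One shared-weight training step: only the sampled path receives
# gradients, so exponentially many sub-architectures are trained jointly.
net = SuperNet()
opt = torch.optim.SGD(net.parameters(), lr=0.01)
images, labels = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(net(images, sample_arch()), labels)
opt.zero_grad()
loss.backward()
opt.step()
```

After the super-network converges, candidate architectures are typically ranked by evaluating them with the weights they inherit from it. The optimization gap discussed in the paper is precisely that this inherited-weight ranking may disagree with the ranking obtained by training each architecture stand-alone.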

Related research:

How Does Supernet Help in Neural Architecture Search? (10/16/2020)
Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks (04/17/2020)
Disturbance-immune Weight Sharing for Neural Architecture Search (03/29/2020)
To Share or Not To Share: A Comprehensive Appraisal of Weight-Sharing (02/11/2020)
How to Train Your Super-Net: An Analysis of Training Heuristics in Weight-Sharing NAS (03/09/2020)
IMB-NAS: Neural Architecture Search for Imbalanced Datasets (09/30/2022)
An Analysis of Super-Net Heuristics in Weight-Sharing NAS (10/04/2021)
