Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks

04/17/2020
by Xin Chen, et al.

Neural architecture search (NAS) has attracted wide attention in both academia and industry. To accelerate it, researchers have proposed weight-sharing methods, which first train a super-network to reuse computation among different operators; exponentially many sub-networks can then be sampled from it and evaluated efficiently. These methods enjoy great advantages in terms of computational cost, but the performance of a sampled sub-network cannot be estimated precisely unless it undergoes an individual training process. This paper attributes the inaccuracy to the inevitable mismatch between assembled network layers, which adds a random error term to each estimate. We alleviate this issue by training a graph convolutional network to fit the performance of sampled sub-networks, so that the impact of these random errors is minimized. With this strategy, we achieve a higher rank correlation coefficient over the selected set of candidates, which in turn leads to better performance of the final architecture. In addition, our approach can flexibly be applied under different hardware constraints, since the graph convolutional network provides an efficient lookup table of the performance of architectures across the entire search space.
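As a rough illustration (not the authors' released code), the sketch below shows how such a GCN-based performance predictor might look: each sub-network is encoded as a graph (a normalized adjacency matrix plus one-hot operator features per node), a few graph-convolution layers aggregate the node features, and a regression head is fitted to the noisy accuracies measured with super-network weights. It is written in PyTorch, and names such as `GCNPredictor`, `num_ops`, and `hidden_dim` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph-convolution step: propagate node features along graph edges."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, adj, x):
        # adj: (batch, n, n) normalized adjacency; x: (batch, n, in_dim)
        return torch.relu(self.linear(adj @ x))

class GCNPredictor(nn.Module):
    """Maps an architecture graph to a scalar performance estimate."""
    def __init__(self, num_ops, hidden_dim=64, num_layers=3):
        super().__init__()
        dims = [num_ops] + [hidden_dim] * num_layers
        self.layers = nn.ModuleList(
            GCNLayer(dims[i], dims[i + 1]) for i in range(num_layers))
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, adj, ops):
        # ops: (batch, n, num_ops) one-hot operator features per node
        x = ops
        for layer in self.layers:
            x = layer(adj, x)
        # Mean-pool node embeddings, then regress the (noisy) accuracy
        # observed when evaluating the sub-network with super-net weights.
        return self.head(x.mean(dim=1)).squeeze(-1)

def fit(predictor, dataset, epochs=100, lr=1e-3):
    """Fit the predictor to super-net-evaluated sub-network accuracies.

    `dataset` is assumed to yield (adj, ops, acc) batches.
    """
    opt = torch.optim.Adam(predictor.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for adj, ops, acc in dataset:
            opt.zero_grad()
            loss_fn(predictor(adj, ops), acc).backward()
            opt.step()
```

Once fitted, evaluating the predictor on every encodable architecture yields the lookup table mentioned above, which can then be filtered under a given hardware constraint (e.g., a latency or parameter budget) before selecting candidates.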

Related research:

08/04/2020 · Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap
02/11/2020 · To Share or Not To Share: A Comprehensive Appraisal of Weight-Sharing
10/04/2021 · An Analysis of Super-Net Heuristics in Weight-Sharing NAS
09/30/2019 · RNAS: Architecture Ranking for Powerful Networks
02/16/2021 · AlphaNet: Improved Training of Supernet with Alpha-Divergence
03/31/2019 · Understanding Neural Architecture Search Techniques
03/09/2020 · How to Train Your Super-Net: An Analysis of Training Heuristics in Weight-Sharing NAS
