Semi-Supervised Neural Architecture Search

02/24/2020
by Renqian Luo, et al.

Neural architecture search (NAS) relies on a good controller to generate better architectures or predict the accuracy of given architectures. However, training the controller requires both abundant and high-quality pairs of architectures and their accuracy, while it is costly to evaluate an architecture and obtain its accuracy. In this paper, we propose SemiNAS, a semi-supervised NAS approach that leverages numerous unlabeled architectures (without evaluation and thus nearly no cost) to improve the controller. Specifically, SemiNAS 1) trains an initial controller with a small set of architecture-accuracy data pairs; 2) uses the trained controller to predict the accuracy of a large number of architectures (without evaluation); and 3) adds the generated data pairs to the original data to further improve the controller. SemiNAS has two advantages: 1) it reduces the computational cost under the same accuracy guarantee; 2) it achieves higher accuracy under the same computational cost. On the NASBench-101 benchmark dataset, it discovers a top 0.01% architecture with only 1/7 computational cost compared with regularized evolution and gradient-based methods. On ImageNet, it achieves 24.2% top-1 error rate (under the mobile setting) using 4 GPU-days for search. We further apply it to the LJSpeech text-to-speech task, where it achieves a 97% intelligibility rate in the low-resource setting and a 15% sentence error rate in the robustness setting, with 9% and 7% improvements over the baseline respectively. Our code is available at https://github.com/renqianluo/SemiNAS.
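The three steps described above amount to a pseudo-labeling loop: evaluate a few architectures, fit an accuracy predictor, use it to label many unevaluated architectures for nearly free, then retrain on the combined set. The sketch below illustrates that workflow on a toy search space; the lookup-table "controller", the linear `true_accuracy` function, and all sizes are hypothetical stand-ins (the actual SemiNAS controller is a learned encoder-predictor-decoder network, and real evaluation means training each architecture).

```python
import random

random.seed(0)

NUM_LAYERS, NUM_OPS = 4, 3  # toy search space: 4 layers, 3 candidate ops each

def true_accuracy(arch):
    # Hypothetical ground truth (op 2 is best at every layer); this stands in
    # for the costly train-and-evaluate step in real NAS.
    return sum(0.1 * op for op in arch) / NUM_LAYERS + 0.6

def random_arch():
    return tuple(random.randrange(NUM_OPS) for _ in range(NUM_LAYERS))

def train_controller(pairs):
    # Toy "controller": a per-(layer, op) mean-accuracy table, standing in
    # for the neural accuracy predictor used in the paper.
    sums, counts = {}, {}
    for arch, acc in pairs:
        for pos, op in enumerate(arch):
            sums[(pos, op)] = sums.get((pos, op), 0.0) + acc
            counts[(pos, op)] = counts.get((pos, op), 0) + 1
    global_mean = sum(a for _, a in pairs) / len(pairs)
    table = {k: sums[k] / counts[k] for k in sums}
    return table, global_mean

def predict(controller, arch):
    # Predicted accuracy = mean of the per-position table entries.
    table, global_mean = controller
    vals = [table.get((pos, op), global_mean) for pos, op in enumerate(arch)]
    return sum(vals) / len(vals)

# 1) Train an initial controller with a small set of evaluated pairs.
labeled = [(a, true_accuracy(a)) for a in (random_arch() for _ in range(20))]
controller = train_controller(labeled)

# 2) Pseudo-label a large number of architectures without evaluation.
unlabeled = [random_arch() for _ in range(200)]
pseudo = [(a, predict(controller, a)) for a in unlabeled]

# 3) Add the generated pairs to the original data and retrain.
controller = train_controller(labeled + pseudo)

# Finally, use the improved controller to rank candidate architectures.
best = max((random_arch() for _ in range(100)),
           key=lambda a: predict(controller, a))
```

Note that in this degenerate toy the pseudo-labels come from the same predictor that produced them; in the paper the gain arises because the controller's encoder benefits from seeing many more architectures than were ever evaluated.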

