Multi-objective Neural Architecture Search with Almost No Training

11/27/2020
by   Shengran Hu, et al.

In the recent past, neural architecture search (NAS) has attracted increasing attention from both academia and industry. Despite the steady stream of impressive empirical results, most existing NAS algorithms are computationally prohibitive to execute due to the costly iterations of stochastic gradient descent (SGD) training. In this work, we propose an effective alternative, dubbed Random-Weight Evaluation (RWE), to rapidly estimate the performance of network architectures. By training only the last linear classification layer, RWE reduces the computational cost of evaluating an architecture from hours to seconds. When integrated within an evolutionary multi-objective algorithm, RWE obtains a set of efficient architectures with state-of-the-art performance on CIFAR-10 in less than two hours of searching on a single GPU card. Ablation studies on rank-order correlations and transfer learning experiments on ImageNet further validate the effectiveness of RWE.
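The core idea of RWE can be illustrated with a minimal sketch: freeze a randomly initialized feature extractor, fit only a linear classifier on top, and use the resulting accuracy as a proxy score for the architecture. The snippet below is an illustrative NumPy toy (a random projection with ReLU standing in for a CNN backbone, and a ridge-regression classifier standing in for the trained linear layer); it is not the paper's implementation, and all function and variable names are our own.

```python
import numpy as np

def random_weight_evaluation(X, y, feature_dim=64, seed=0):
    """Toy RWE-style proxy score: the 'backbone' weights are random and
    never trained; only the last linear layer is fitted (here via
    closed-form ridge regression). Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    # Frozen, randomly initialized "backbone": random projection + ReLU
    W = rng.standard_normal((X.shape[1], feature_dim))
    feats = np.maximum(X @ W, 0.0)  # features from untrained weights
    # Train only the last linear layer on one-hot targets
    Y = np.eye(int(y.max()) + 1)[y]
    A = feats.T @ feats + 1e-3 * np.eye(feature_dim)  # ridge regularizer
    coef = np.linalg.solve(A, feats.T @ Y)
    preds = (feats @ coef).argmax(axis=1)
    return (preds == y).mean()  # accuracy as a cheap proxy score
```

Because no backbone weights are updated, each candidate architecture is scored in a single forward pass plus a small linear solve, which is what collapses per-architecture evaluation from hours of SGD to seconds.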
