Multi-objective Neural Architecture Search with Almost No Training

11/27/2020
by Shengran Hu et al.

In the recent past, neural architecture search (NAS) has attracted increasing attention from both academia and industry. Despite a steady stream of impressive empirical results, most existing NAS algorithms are computationally prohibitive to run because of the costly iterations of stochastic gradient descent (SGD) training. In this work, we propose an effective alternative, dubbed Random-Weight Evaluation (RWE), to rapidly estimate the performance of network architectures. By training only the last linear classification layer (the remaining weights stay at their random initialization), RWE reduces the cost of evaluating an architecture from hours to seconds. When integrated into an evolutionary multi-objective algorithm, RWE obtains a set of efficient architectures with state-of-the-art performance on CIFAR-10 in less than two hours of search on a single GPU card. Ablation studies on rank-order correlations and transfer learning experiments on ImageNet further validate the effectiveness of RWE.
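The sketch below is a minimal illustration of the RWE idea described above, assuming a PyTorch setting. The toy backbone, random stand-in data, and hyperparameters are illustrative placeholders, not the paper's exact protocol.

```python
# Minimal sketch of Random-Weight Evaluation (RWE), assuming a PyTorch setting.
# The toy backbone, random data, and hyperparameters are illustrative placeholders.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset


def rwe_score(backbone, feature_dim, train_loader, val_loader,
              num_classes=10, epochs=1, lr=0.1, device="cpu"):
    """Proxy score for an architecture: train only a linear classifier on top
    of the frozen, randomly initialized backbone, then report its accuracy."""
    backbone = backbone.to(device).eval()
    for p in backbone.parameters():            # keep the random weights fixed
        p.requires_grad_(False)

    head = nn.Linear(feature_dim, num_classes).to(device)
    opt = torch.optim.SGD(head.parameters(), lr=lr, momentum=0.9)

    for _ in range(epochs):                    # train the linear head only
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            with torch.no_grad():
                feats = backbone(x).flatten(1)
            loss = F.cross_entropy(head(feats), y)
            opt.zero_grad()
            loss.backward()
            opt.step()

    correct = total = 0                        # accuracy serves as the proxy
    with torch.no_grad():
        for x, y in val_loader:
            x, y = x.to(device), y.to(device)
            preds = head(backbone(x).flatten(1)).argmax(dim=1)
            correct += (preds == y).sum().item()
            total += y.numel()
    return correct / total


if __name__ == "__main__":
    # Toy demo with random tensors standing in for CIFAR-10-sized images.
    backbone = nn.Sequential(
        nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
    )
    data = TensorDataset(torch.randn(256, 3, 32, 32),
                         torch.randint(0, 10, (256,)))
    loader = DataLoader(data, batch_size=64, shuffle=True)
    acc = rwe_score(backbone, feature_dim=32,
                    train_loader=loader, val_loader=loader)
    print(f"RWE proxy accuracy: {acc:.3f}")
```

Inside an evolutionary multi-objective search, a score like this could be paired with a complexity objective (for example, parameter count or FLOPs) so that only non-dominated architectures are carried into the next generation; the choice of complexity measure here is an assumption for illustration, not taken from the abstract.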

