Accelerating Multi-Objective Neural Architecture Search by Random-Weight Evaluation

10/08/2021
by   Shengran Hu, et al.

For the goal of automated design of high-performance deep convolutional neural networks (CNNs), Neural Architecture Search (NAS) methodology is becoming increasingly important for both academia and industry. Due to the costly stochastic gradient descent (SGD) training of CNNs for performance evaluation, most existing NAS methods are computationally expensive for real-world deployment. To address this issue, we first introduce a new performance estimation metric, named Random-Weight Evaluation (RWE), to quantify the quality of CNNs in a cost-efficient manner. Instead of fully training the entire CNN, RWE trains only its last layer and leaves the remainder with randomly initialized weights, reducing a single network evaluation to seconds. Second, a complexity metric is adopted for multi-objective NAS to balance model size against performance. Overall, the proposed method obtains a set of efficient models with state-of-the-art performance in two real-world search spaces. The results obtained on the CIFAR-10 dataset are then transferred to the ImageNet dataset to validate the practicality of the proposed algorithm. Moreover, ablation studies on the NAS-Bench-301 dataset demonstrate the effectiveness of the proposed RWE in estimating performance compared with existing methods.
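The core idea of RWE described above can be illustrated with a minimal sketch. This is not the authors' implementation: a random frozen linear-plus-ReLU map stands in for the CNN body with random weights, and only a last linear (softmax) layer is trained on top; the resulting validation accuracy serves as the cheap proxy score. The function names (`random_features`, `rwe_score`) and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_features(x, w_frozen):
    """Frozen 'backbone' stand-in: a random linear map followed by ReLU."""
    return np.maximum(x @ w_frozen, 0.0)

def rwe_score(x_train, y_train, x_val, y_val, feat_dim=64, epochs=200, lr=0.1):
    """Train only the last layer over frozen random features; return
    validation accuracy as a cheap performance-estimation proxy."""
    # Randomly initialized, never-updated weights for the backbone.
    w_frozen = rng.normal(0.0, 1.0 / np.sqrt(x_train.shape[1]),
                          (x_train.shape[1], feat_dim))
    h_train = random_features(x_train, w_frozen)
    h_val = random_features(x_val, w_frozen)

    # The only trainable parameters: the last (linear) layer.
    n_classes = int(y_train.max()) + 1
    w_last = np.zeros((feat_dim, n_classes))
    y_onehot = np.eye(n_classes)[y_train]

    # Softmax regression on the frozen features via full-batch gradient descent.
    for _ in range(epochs):
        logits = h_train @ w_last
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        w_last -= lr * h_train.T @ (p - y_onehot) / len(h_train)

    return (np.argmax(h_val @ w_last, axis=1) == y_val).mean()

# Toy linearly separable data as a stand-in for image features.
x = rng.normal(size=(400, 20))
y = (x[:, 0] + x[:, 1] > 0).astype(int)
score = rwe_score(x[:300], y[:300], x[300:], y[300:])
```

Because the backbone is never updated, each candidate architecture needs only one cheap last-layer fit, which is what makes a single evaluation take seconds rather than the hours of full SGD training.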


