Shrink-Perturb Improves Architecture Mixing during Population Based Training for Neural Architecture Search

07/28/2023
by Alexander Chebykin, et al.

In this work, we show that simultaneously training and mixing neural networks is a promising way to conduct Neural Architecture Search (NAS). For hyperparameter optimization, reusing partially trained weights enables efficient search, as previously demonstrated by the Population Based Training (PBT) algorithm. We propose PBT-NAS, an adaptation of PBT to NAS in which architectures are improved during training by replacing poorly-performing networks in a population with the result of mixing well-performing ones, inheriting the weights via the shrink-perturb technique. After PBT-NAS terminates, the created networks can be used directly without retraining. PBT-NAS is highly parallelizable and effective: on challenging tasks (image generation and reinforcement learning), PBT-NAS achieves superior performance compared to baselines (random search and mutation-based PBT).
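The abstract mentions the shrink-perturb technique (Ash & Adams, 2020) for inheriting weights when a network's architecture changes. Below is a minimal PyTorch sketch of that idea, assuming the inherited weights live in an ordinary `nn.Module`; the function name `shrink_perturb`, the default hyperparameter values, and the Xavier initialisation used to generate the noise are illustrative assumptions, not taken from the paper.

```python
import copy
import torch

def shrink_perturb(inherited_model: torch.nn.Module,
                   shrink: float = 0.4,
                   perturb: float = 0.1) -> torch.nn.Module:
    """Shrink inherited weights toward zero and add fresh-initialisation noise.

    shrink:  factor the inherited weights are multiplied by (lambda).
    perturb: factor the freshly initialised weights are scaled by (gamma).
    The default values are illustrative, not the paper's settings.
    """
    # A freshly initialised copy of the network supplies the perturbation noise.
    fresh_model = copy.deepcopy(inherited_model)
    for p in fresh_model.parameters():
        if p.dim() > 1:
            torch.nn.init.xavier_uniform_(p)   # weight matrices / conv kernels
        else:
            torch.nn.init.zeros_(p)            # biases and other 1-D parameters

    # In place: theta <- shrink * theta_inherited + perturb * theta_fresh
    with torch.no_grad():
        for inherited, fresh in zip(inherited_model.parameters(),
                                    fresh_model.parameters()):
            inherited.mul_(shrink).add_(perturb * fresh)
    return inherited_model
```

In a PBT-NAS-style setup, one would presumably apply such a transformation to the weights a newly mixed network inherits from its well-performing parents before its training resumes, so that the inherited weights are softened rather than copied verbatim.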
