Tackling Neural Architecture Search With Quality Diversity Optimization

07/30/2022
by Lennart Schneider, et al.

Neural architecture search (NAS) has been studied extensively and has grown into a research field with substantial impact. While classical single-objective NAS searches for the architecture with the best performance, multi-objective NAS considers multiple objectives that should be optimized simultaneously, e.g., minimizing resource usage alongside the validation error. Although considerable progress has been made in the field of multi-objective NAS, we argue that there is some discrepancy between the optimization problem of actual practical interest and the optimization problem that multi-objective NAS tries to solve. We resolve this discrepancy by formulating the multi-objective NAS problem as a quality diversity optimization (QDO) problem and introduce three quality diversity NAS optimizers (two of which are multifidelity optimizers) that search for high-performing yet diverse architectures, each optimal for an application-specific niche, e.g., a hardware constraint. By comparing these optimizers to their multi-objective counterparts, we demonstrate that quality diversity NAS generally outperforms multi-objective NAS with respect to solution quality and efficiency. We further show how applications and future NAS research can thrive on QDO.
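To make the niche-based formulation concrete, below is a minimal MAP-Elites-style sketch of quality diversity NAS, assuming niches defined by upper bounds on model size (e.g., hardware memory budgets). The niche thresholds, the toy architecture encoding, and the evaluate() stand-in are hypothetical illustrations, not the paper's actual optimizers: the point is only that the archive keeps one elite per niche, and an architecture competes in every niche whose constraint it satisfies.

```python
import random

# Hypothetical niche boundaries: upper bounds on model size in MB.
NICHE_BOUNDS_MB = [2.0, 8.0, 32.0]

def evaluate(architecture):
    """Toy stand-in for training/validating an architecture.
    Returns (validation_error, model_size_mb); larger models
    get lower error here, mimicking the usual trade-off."""
    size = float(sum(architecture)) + random.random()
    error = 1.0 / (1.0 + size) + random.gauss(0.0, 0.01)
    return error, size

def qdo_nas(n_iterations=1000):
    """Minimal QDO loop: maintain one elite per niche. A sampled
    architecture replaces the incumbent of every niche whose size
    constraint it satisfies, if its validation error is lower."""
    archive = {}  # niche index -> (validation_error, architecture)
    for _ in range(n_iterations):
        arch = tuple(random.randint(1, 8) for _ in range(4))  # toy encoding
        error, size = evaluate(arch)
        for i, bound in enumerate(NICHE_BOUNDS_MB):
            if size <= bound and (i not in archive or error < archive[i][0]):
                archive[i] = (error, arch)  # new elite for this niche
    return archive

if __name__ == "__main__":
    for i, (error, arch) in sorted(qdo_nas().items()):
        print(f"niche <= {NICHE_BOUNDS_MB[i]} MB: error={error:.3f}, arch={arch}")
```

Note that the niches are nested, so a small architecture can be the elite of several niches at once; this is what distinguishes the QDO view from maintaining a single Pareto front.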


