Neural Architecture Search as Multiobjective Optimization Benchmarks: Problem Formulation and Performance Assessment

08/08/2022
by Zhichao Lu, et al.

The ongoing advancements in network architecture design have led to remarkable achievements in deep learning across various challenging computer vision tasks. Meanwhile, the development of neural architecture search (NAS) has provided promising approaches to automating the design of network architectures for lower prediction error. Recently, emerging application scenarios of deep learning have raised higher demands on network architectures, which must now satisfy multiple design criteria: the number of parameters, floating-point operations (FLOPs), and inference latency, among others. From an optimization point of view, NAS tasks involving multiple design criteria are intrinsically multiobjective optimization problems; hence, it is reasonable to adopt evolutionary multiobjective optimization (EMO) algorithms for tackling them. Nonetheless, a clear gap still confines research along this pathway: on the one hand, there is a lack of a general problem formulation of NAS tasks from an optimization point of view; on the other hand, benchmark assessments of EMO algorithms on NAS tasks remain challenging to conduct. To bridge this gap: (i) we formulate NAS tasks as general multiobjective optimization problems and analyze their complex characteristics from an optimization point of view; (ii) we present an end-to-end pipeline, dubbed EvoXBench, that generates benchmark test problems on which EMO algorithms can run efficiently, without requiring GPUs or PyTorch/TensorFlow; (iii) we instantiate two test suites that comprehensively cover two datasets, seven search spaces, and three hardware devices, and involve up to eight objectives. Building on the above, we validate the proposed test suites using six representative EMO algorithms and provide empirical analyses. The code of EvoXBench is available at https://github.com/EMI-Group/EvoXBench.
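To make contribution (i) concrete, the general multiobjective NAS problem described in the abstract can be stated as below. This is a standard formulation under assumed notation (the paper's exact symbols may differ): x encodes an architecture drawn from a search space Ω, with prediction error as one objective and hardware-related criteria as the others.

```latex
% General multiobjective NAS formulation (notation assumed, not quoted from
% the paper): x encodes an architecture from the search space \Omega; f_1 is
% prediction error on a validation set, and f_2, ..., f_m are further design
% criteria such as #parameters, FLOPs, or inference latency.
\begin{equation}
  \min_{x \in \Omega} \; F(x) = \bigl( f_1(x),\, f_2(x),\, \dots,\, f_m(x) \bigr)
\end{equation}
```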
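To illustrate why such tasks call for EMO algorithms, the sketch below samples a toy NAS search space under two conflicting objectives and extracts the Pareto front by non-dominated filtering. It is a minimal, self-contained example: the integer encoding, the two surrogate objectives, and all function names are hypothetical stand-ins and do not reflect EvoXBench's actual API.

```python
# Illustrative sketch only: a toy bi-objective NAS problem explored by random
# sampling with non-dominated filtering. The encoding and both objective
# surrogates are hypothetical stand-ins, not EvoXBench's interface.
import random

NUM_LAYERS = 6   # length of the architecture encoding
NUM_OPS = 4      # candidate operations per layer (e.g., skip, conv3x3, ...)

def sample_architecture():
    """Sample a random architecture as a vector of operation indices."""
    return [random.randrange(NUM_OPS) for _ in range(NUM_LAYERS)]

def evaluate(arch):
    """Return (prediction_error, model_size); both are toy surrogates here.
    In a real benchmark these would come from tabulated records or fitted
    surrogate models rather than closed-form expressions."""
    error = 1.0 / (1.0 + sum(arch))        # heavier ops -> lower error
    size = sum(op ** 2 for op in arch)     # heavier ops -> larger model
    return error, size

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated (architecture, objectives) pairs."""
    return [p for p in points
            if not any(dominates(q[1], p[1]) for q in points if q is not p)]

if __name__ == "__main__":
    population = [(a, evaluate(a))
                  for a in (sample_architecture() for _ in range(200))]
    for arch, (err, size) in sorted(pareto_front(population), key=lambda p: p[1]):
        print(f"arch={arch}  error={err:.3f}  size={size}")
```

Because the objectives are surrogates rather than trained networks, the whole loop runs in milliseconds on a CPU; this is presumably what allows EMO algorithms to run efficiently without GPUs or deep-learning frameworks, as the abstract notes.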

