Learning Where To Look – Generative NAS is Surprisingly Efficient

03/16/2022
by   Jovita Lukasik, et al.

The efficient, automated search for well-performing neural architectures (NAS) has drawn increasing attention in recent years. The predominant research objective is to reduce the need for costly evaluations of neural architectures while efficiently exploring large search spaces. To this end, surrogate models embed architectures in a latent space and predict their performance, while generative models for neural architectures enable optimization-based search within the latent space from which the generator draws. Both surrogate and generative models aim to facilitate query-efficient search in a well-structured latent space. In this paper, we further improve the trade-off between query efficiency and promising architecture generation by combining the advantages of efficient surrogate models and generative design. To this end, we propose a generative model, paired with a surrogate predictor, that iteratively learns to generate samples from increasingly promising latent subspaces. This approach yields highly effective and efficient architecture search while keeping the number of queries low. In addition, our approach makes it straightforward to jointly optimize for multiple objectives, such as accuracy and hardware latency. We demonstrate the benefit of this approach not only for optimizing architectures for highest classification accuracy but also under hardware constraints, and we outperform state-of-the-art methods on several NAS benchmarks for single and multiple objectives. We also achieve state-of-the-art performance on ImageNet.
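The core loop described above — a surrogate predictor guiding which latent samples are worth an expensive evaluation, with sampling concentrating on increasingly promising regions — can be illustrated with a toy sketch. This is not the paper's implementation: the quadratic black-box objective, the ridge-regression surrogate, and the "sample around the best latents" heuristic are all simplified stand-ins for the actual architecture evaluation, learned predictor, and generative model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the true, expensive evaluation: in real NAS this would
# mean decoding the latent vector into an architecture and training it.
def evaluate_architecture(z):
    return -np.sum((z - 1.5) ** 2)  # best "accuracy" near z = (1.5, 1.5)

def fit_surrogate(Z, y):
    # Simple ridge-regression surrogate on polynomial features of the latents.
    phi = lambda Zq: np.hstack([Zq, Zq ** 2, np.ones((len(Zq), 1))])
    A = phi(Z)
    w = np.linalg.solve(A.T @ A + 1e-3 * np.eye(A.shape[1]), A.T @ y)
    return lambda Zq: phi(Zq) @ w

# A small pool of initial random queries (the costly evaluations to minimize).
Z = rng.normal(size=(8, 2))
y = np.array([evaluate_architecture(z) for z in Z])

for step in range(5):
    surrogate = fit_surrogate(Z, y)
    # Generate candidates around the best latents found so far,
    # i.e. sample from an increasingly promising latent subspace.
    best = Z[np.argsort(y)[-4:]]
    candidates = best[rng.integers(0, 4, size=64)] + 0.3 * rng.normal(size=(64, 2))
    # Spend the query budget only on the candidates the surrogate ranks highest.
    top = candidates[np.argsort(surrogate(candidates))[-2:]]
    Z = np.vstack([Z, top])
    y = np.concatenate([y, [evaluate_architecture(z) for z in top]])

print("best latent found:", Z[np.argmax(y)])
```

With only two true evaluations per iteration, the search concentrates its queries where the surrogate predicts high performance, mirroring the query-efficiency argument of the abstract; a multi-objective variant would simply rank candidates by a combined score (e.g. predicted accuracy and latency).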


