GraphPNAS: Learning Distribution of Good Neural Architectures via Deep Graph Generative Models

11/28/2022
by   Muchen Li, et al.

Neural architectures can be naturally viewed as computational graphs. Motivated by this perspective, we study neural architecture search (NAS) through the lens of learning random graph models. In contrast to existing NAS methods, which largely focus on searching for a single best architecture, i.e., point estimation, we propose GraphPNAS, a deep graph generative model that learns a distribution of well-performing architectures. Relying on graph neural networks (GNNs), GraphPNAS can better capture the topologies of good neural architectures and the relations between the operators therein. Moreover, our graph generator leads to a learnable probabilistic search method that is more flexible and efficient than the commonly used RNN generator and random search methods. Finally, we learn our generator via an efficient reinforcement learning formulation for NAS. To assess the effectiveness of GraphPNAS, we conduct extensive experiments on three search spaces: the challenging RandWire on TinyImageNet, ENAS on CIFAR10, and NAS-Bench-101/201. The complexity of RandWire is significantly larger than that of other search spaces in the literature. We show that our proposed graph generator consistently outperforms the RNN-based one and achieves better or comparable performance to state-of-the-art NAS methods.
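The abstract describes training the graph generator with a reinforcement learning formulation, where sampled architecture graphs are rewarded by how well they perform. As a rough illustration only, and not the paper's implementation, the sketch below shows a REINFORCE-style training loop in which a toy Bernoulli edge generator stands in for GraphPNAS's GNN-based generator. `ToyGraphGenerator` and `evaluate_architecture` are hypothetical names, and the reward here is a dummy placeholder for validation accuracy of a trained network.

```python
# Hypothetical sketch of a REINFORCE-style NAS loop over sampled graphs.
# The toy Bernoulli edge model stands in for a GNN-based graph generator.
import torch

class ToyGraphGenerator(torch.nn.Module):
    """Samples a DAG over `num_nodes` via independent Bernoulli edges."""
    def __init__(self, num_nodes: int):
        super().__init__()
        # One logit per possible forward edge i -> j with i < j.
        self.edge_logits = torch.nn.Parameter(torch.zeros(num_nodes, num_nodes))
        self.mask = torch.triu(torch.ones(num_nodes, num_nodes), diagonal=1)

    def sample(self):
        probs = torch.sigmoid(self.edge_logits) * self.mask
        edges = torch.bernoulli(probs)  # non-differentiable sample
        # Log-probability of the sampled graph under the generator.
        log_prob = (edges * torch.log(probs + 1e-8)
                    + (self.mask - edges) * torch.log(1 - probs + 1e-8)).sum()
        return edges, log_prob

def evaluate_architecture(edges: torch.Tensor) -> float:
    """Placeholder reward; in practice, validation accuracy of the
    network built from the sampled graph."""
    return float(edges.sum()) / float(edges.numel())  # dummy stand-in

generator = ToyGraphGenerator(num_nodes=8)
optimizer = torch.optim.Adam(generator.parameters(), lr=1e-2)
baseline = 0.0  # moving-average baseline to reduce gradient variance

for step in range(100):
    edges, log_prob = generator.sample()
    reward = evaluate_architecture(edges)
    baseline = 0.9 * baseline + 0.1 * reward
    loss = -(reward - baseline) * log_prob  # REINFORCE objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```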

Related research

05/19/2021 - Generative Adversarial Neural Architecture Search
Despite the empirical success of neural architecture search (NAS) in dee...

03/16/2022 - Learning Where To Look – Generative NAS is Surprisingly Efficient
The efficient, automated search for well-performing neural architectures...

05/26/2023 - DiffusionNAG: Task-guided Neural Architecture Generation with Diffusion Models
Neural Architecture Search (NAS) has emerged as a powerful technique for...

09/20/2023 - Grassroots Operator Search for Model Edge Adaptation
Hardware-aware Neural Architecture Search (HW-NAS) is increasingly being...

11/13/2020 - Reducing Inference Latency with Concurrent Architectures for Image Recognition
Satisfying the high computation demand of modern deep learning architect...

05/30/2019 - On Network Design Spaces for Visual Recognition
Over the past several years progress in designing better neural network ...

03/21/2020 - Probabilistic Dual Network Architecture Search on Graphs
We present the first differentiable Network Architecture Search (NAS) fo...
