Graph HyperNetworks for Neural Architecture Search

10/12/2018
by Chris Zhang, et al.

Neural architecture search (NAS) automatically finds the best task-specific neural network topology, outperforming many manual architecture designs. However, it can be prohibitively expensive, as the search requires training thousands of different networks, each of which can take hours. In this work, we propose the Graph HyperNetwork (GHN) to amortize the search cost: given an architecture, it directly generates the weights by running inference on a graph neural network. GHNs model the topology of an architecture and can therefore predict network performance more accurately than regular hypernetworks or premature early stopping. To perform NAS, we randomly sample architectures and use the validation accuracy of networks with GHN-generated weights as the surrogate search signal. GHNs are fast -- they can search nearly 10 times faster than other random search methods on CIFAR-10 and ImageNet. GHNs can be further extended to the anytime-prediction setting, where they have found networks with a better speed-accuracy tradeoff than state-of-the-art manual designs.
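The search procedure described in the abstract amounts to amortized evaluation: sample an architecture, let the GHN generate its weights in a single forward pass, and rank candidates by the resulting validation accuracy. The toy PyTorch sketch below illustrates that loop under simplifying assumptions; GraphHyperNet, sample_architecture, and surrogate_accuracy are illustrative names rather than the authors' code, and the DAG encoding, message passing, and scoring function are stand-ins for the paper's actual components.

```python
# Minimal sketch (not the authors' implementation) of GHN-guided random search.
# Assumptions: an architecture is a DAG given as (node_ops, adjacency); the toy
# GraphHyperNet runs a couple of message-passing steps over the graph and emits
# one weight vector per node; surrogate_accuracy stands in for evaluating the
# candidate with the generated weights on a held-out validation set.

import torch
import torch.nn as nn

class GraphHyperNet(nn.Module):
    """Toy graph hypernetwork: op embeddings -> message passing -> weights."""
    def __init__(self, num_ops, hidden=32, weight_numel=64):
        super().__init__()
        self.embed = nn.Embedding(num_ops, hidden)      # one embedding per op type
        self.message = nn.Linear(hidden, hidden)        # transform neighbour states
        self.update = nn.GRUCell(hidden, hidden)        # node-state update
        self.readout = nn.Linear(hidden, weight_numel)  # hidden state -> weights

    def forward(self, node_ops, adjacency):
        h = self.embed(node_ops)                        # (N, hidden)
        for _ in range(2):                              # a few propagation steps
            msgs = adjacency @ self.message(h)          # sum messages from predecessors
            h = self.update(msgs, h)
        return self.readout(h)                          # (N, weight_numel) generated weights

def sample_architecture(num_nodes=5, num_ops=4):
    """Randomly sample a DAG: an op type per node, edges only from earlier nodes."""
    node_ops = torch.randint(0, num_ops, (num_nodes,))
    adjacency = torch.tril(torch.randint(0, 2, (num_nodes, num_nodes)).float(), diagonal=-1)
    return node_ops, adjacency

def surrogate_accuracy(weights):
    """Placeholder for validation accuracy of the candidate network built with
    the GHN-generated weights; here it just returns a stand-in score."""
    return weights.abs().mean().item()

ghn = GraphHyperNet(num_ops=4)
best_score, best_arch = -1.0, None
for _ in range(100):                                    # random search over architectures
    arch = sample_architecture()
    weights = ghn(*arch)                                # amortized weight generation
    score = surrogate_accuracy(weights)                 # surrogate search signal
    if score > best_score:
        best_score, best_arch = score, arch
print("best surrogate score:", best_score)
```

Note that this sketch omits the GHN's own training phase: in the paper, the hypernetwork is trained by building candidate networks with its generated weights and backpropagating the task loss through the GHN, so that the generated weights become a useful proxy for fully trained ones.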
