Neural Architecture Search without Training

06/08/2020 · by Joseph Mellor, et al.

The time and effort involved in hand-designing deep neural networks is immense. This has prompted the development of Neural Architecture Search (NAS) techniques to automate this design. However, NAS algorithms tend to be extremely slow and expensive; they need to train vast numbers of candidate networks to inform the search process. This could be remedied if we could infer a network's trained accuracy from its initial state. In this work, we examine how the linear maps induced by data points correlate for untrained network architectures in the NAS-Bench-201 search space, and motivate how this can be used to give a measure of modelling flexibility which is highly indicative of a network's trained performance. We incorporate this measure into a simple algorithm that allows us to search for powerful networks without any training in a matter of seconds on a single GPU. Code to reproduce our experiments is available at https://github.com/BayesWatch/nas-without-training.
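The measure described in the abstract can be illustrated with a toy sketch: in a ReLU network, each input induces a binary code of activation signs, and inputs whose codes differ lie in different linear regions of the network, so they are handled by more distinct linear maps. A kernel built from the similarity of these codes over a minibatch, scored by its log-determinant, rewards untrained networks that separate inputs into diverse linear regions. The snippet below is a minimal illustration of this idea for a random MLP, not the authors' implementation (the network, layer sizes, and the `naswot_style_score` name are invented here; see the linked repository for the actual code).

```python
import numpy as np

def naswot_style_score(weights, biases, x):
    """Score an untrained ReLU network by the diversity of the binary
    activation codes it assigns to a minibatch (illustrative sketch;
    details may differ from the authors' released code)."""
    codes = []
    h = x
    for W, b in zip(weights, biases):
        h = h @ W + b
        codes.append((h > 0).astype(np.float64))  # sign pattern of this layer
        h = np.maximum(h, 0.0)                    # ReLU
    c = np.concatenate(codes, axis=1)             # one binary code per input
    # K[i, j] counts the units where inputs i and j share an activation sign,
    # i.e. (number of units) minus the Hamming distance between their codes.
    K = c @ c.T + (1.0 - c) @ (1.0 - c).T
    # K is a sum of Gram matrices, hence positive semi-definite; its
    # log-determinant is large when the codes (linear regions) are diverse.
    sign, logdet = np.linalg.slogdet(K)
    return logdet

# Example: score one random, untrained two-layer network on a minibatch.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 64)), rng.standard_normal((64, 64))]
biases = [np.zeros(64), np.zeros(64)]
x = rng.standard_normal((32, 16))
print(naswot_style_score(weights, biases, x))
```

A search procedure in this spirit would sample candidate architectures, compute such a score for each at initialisation, and keep the highest-scoring one, requiring only forward passes and no training.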

Related research

02/16/2021 · EPE-NAS: Efficient Performance Estimation Without Training for Neural Architecture Search
Neural Architecture Search (NAS) has shown excellent results in designin...

02/23/2021 · Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective
Neural Architecture Search (NAS) has been explosively studied to automat...

02/12/2021 · Neural Architecture Search as Program Transformation Exploration
Improving the performance of deep neural networks (DNNs) is important to...

04/22/2019 · Towards Learning of Filter-Level Heterogeneous Compression of Convolutional Neural Networks
Recently, deep learning has become a de facto standard in machine learni...

01/19/2021 · Learning Efficient, Explainable and Discriminative Representations for Pulmonary Nodules Classification
Automatic pulmonary nodules classification is significant for early diag...

09/14/2020 · RelativeNAS: Relative Neural Architecture Search via Slow-Fast Learning
Despite the remarkable successes of Convolutional Neural Networks (CNNs)...

03/07/2021 · Auto-tuning of Deep Neural Networks by Conflicting Layer Removal
Designing neural network architectures is a challenging task and knowing...
