Light and Accurate: Neural Architecture Search via Two Constant Shared Weights Initialisations

02/09/2023
by Ekaterina Gracheva, et al.

In recent years, zero-cost proxies have been gaining ground in neural architecture search (NAS). These methods make it possible to find the optimal neural network for a given task faster and with a smaller computational load than conventional NAS methods. Equally important, they also shed some light on the internal workings of neural architectures. This paper presents a zero-cost metric that correlates highly with the train set accuracy across the NAS-Bench-101, NAS-Bench-201 and NAS-Bench-NLP benchmark datasets. Architectures are initialised with two distinct constant shared weights, one at a time. A fixed random mini-batch of data is then passed forward through each initialisation. We observe that the dispersion of the outputs between the two initialisations correlates positively with trained accuracy. The correlation improves further when we normalise dispersion by the average output magnitude. Our metric, epsilon, requires neither gradient computation nor labels. It thus unbinds the NAS procedure from training hyperparameters, loss metrics and human-labelled data. Our method is easy to integrate within existing NAS algorithms and takes a fraction of a second to evaluate a single network.
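To make the procedure concrete, here is a minimal PyTorch sketch of the metric as the abstract describes it. The particular constant values (0.1 and 1.0), the choice of standard deviation as the dispersion measure, and the exact form of the normalisation are assumptions for illustration; the paper defines the precise formulation of epsilon.

```python
import torch
import torch.nn as nn

def epsilon_metric(model: nn.Module, batch: torch.Tensor,
                   w1: float = 0.1, w2: float = 1.0) -> float:
    """Zero-cost proxy sketch: score a network from two constant
    shared-weight initialisations, with no gradients and no labels.
    The constants w1/w2 and the dispersion/normalisation choices
    below are assumed, not taken from the paper."""
    outputs = []
    for w in (w1, w2):
        with torch.no_grad():
            # Re-initialise every parameter to the same constant value.
            for p in model.parameters():
                p.fill_(w)
            # Single forward pass of the fixed mini-batch.
            outputs.append(model(batch).flatten())
    stacked = torch.stack(outputs)  # shape: (2, n_outputs)
    # Dispersion of the outputs between the two initialisations,
    # normalised by the average output magnitude.
    dispersion = stacked.std(dim=0).mean()
    magnitude = stacked.abs().mean().clamp_min(1e-12)
    return (dispersion / magnitude).item()
```

A hypothetical usage: `epsilon_metric(candidate_net, torch.randn(16, 3, 32, 32))` scores one CIFAR-style candidate in a single pair of forward passes, so ranking thousands of architectures requires no training loop, loss function or labelled data.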

