Understanding Neural Architecture Search Techniques

03/31/2019
by George Adam, et al.

Automatic methods for generating state-of-the-art neural network architectures without human experts have recently attracted significant attention, since removing experts from the design loop can reduce costs and shorten time to model deployment. Neural architecture search (NAS) techniques have become far more computationally efficient since the original NAS was proposed, largely through weight sharing, as in Efficient Neural Architecture Search (ENAS). However, a recent body of work confirms our discovery that ENAS does not perform significantly better than random search with weight sharing, contradicting the initial claims of its authors. We provide an explanation for this phenomenon by investigating the interpretability of the ENAS controller's hidden state, asking whether the controller embeddings are predictive of any properties of the final architecture, such as graph properties like the number of connections, or validation performance. We find that models sampled from identical controller hidden states show no correlation across various graph similarity metrics. This failure mode implies that the RNN controller does not condition on past architecture choices, which matters because conditioning on past choices may be necessary when certain connection patterns prevent vanishing or exploding gradients. Lastly, we propose a solution to this failure mode: forcing the controller's hidden state to encode past decisions by training it with a memory buffer of previously sampled architectures. Doing so improves hidden-state interpretability by increasing the correlation between controller hidden states and graph similarity metrics.
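To make the failure mode and the proposed fix concrete, here is a minimal sketch, not the authors' code: all names (Controller, memory_buffer_loss, the skip-connection decision space) are hypothetical. It shows an ENAS-style LSTM controller that samples, for each layer, which earlier layer to connect to, together with a memory-buffer auxiliary loss that replays previously sampled architectures and trains the hidden state at each step to reconstruct the preceding decision, forcing the state to actually encode past choices.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Controller(nn.Module):
    """Hypothetical ENAS-style controller: one connectivity decision per layer."""
    def __init__(self, num_layers=6, hidden_size=64):
        super().__init__()
        self.num_layers = num_layers
        self.hidden_size = hidden_size
        self.lstm = nn.LSTMCell(hidden_size, hidden_size)
        # One classifier per layer: which earlier layer to connect to.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_size, max(i, 1)) for i in range(num_layers)]
        )
        # Embedding of the previous decision, fed back as the next input.
        self.embed = nn.Embedding(num_layers, hidden_size)
        # Auxiliary head: decode past decisions from the current hidden state.
        self.decoder = nn.Linear(hidden_size, num_layers)

    def sample(self):
        h = torch.zeros(1, self.hidden_size)
        c = torch.zeros(1, self.hidden_size)
        x = torch.zeros(1, self.hidden_size)
        decisions, states = [], []
        for i in range(self.num_layers):
            h, c = self.lstm(x, (h, c))
            logits = self.heads[i](h)
            choice = torch.multinomial(F.softmax(logits, dim=-1), 1)
            decisions.append(choice.item())
            states.append(h)
            x = self.embed(choice.squeeze(0))
        return decisions, states

def memory_buffer_loss(controller, buffer):
    """Auxiliary loss: hidden states must be predictive of past choices.

    `buffer` holds previously sampled decision sequences; each is replayed
    through the controller, and the hidden state at step i is asked to
    reconstruct the decision made at step i - 1.
    """
    loss = torch.tensor(0.0)
    for decisions in buffer:
        h = torch.zeros(1, controller.hidden_size)
        c = torch.zeros(1, controller.hidden_size)
        x = torch.zeros(1, controller.hidden_size)
        for i, d in enumerate(decisions):
            h, c = controller.lstm(x, (h, c))
            if i > 0:  # the state should encode the previous decision
                target = torch.tensor([decisions[i - 1]])
                loss = loss + F.cross_entropy(controller.decoder(h), target)
            x = controller.embed(torch.tensor([d]))
    return loss / max(len(buffer), 1)

# Usage under these assumptions: sample architectures into a buffer, then
# add the auxiliary term to the usual controller (REINFORCE) objective.
controller = Controller()
buffer = [controller.sample()[0] for _ in range(8)]
aux = memory_buffer_loss(controller, buffer)

Under this sketch, adding memory_buffer_loss to the standard controller objective penalizes hidden states that forget earlier connection decisions, which is one plausible way to realize the memory-buffer training the abstract describes.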

Related research

09/24/2020  Disentangled Neural Architecture Search
Neural architecture search has shown its great potential in various area...

12/16/2022  From Xception to NEXcepTion: New Design Decisions and Neural Architecture Search
In this paper, we present a modified Xception architecture, the NEXcepTi...

07/22/2019  Efficient Novelty-Driven Neural Architecture Search
One-Shot Neural architecture search (NAS) attracts broad attention recen...

01/19/2021  ES-ENAS: Combining Evolution Strategies with Neural Architecture Search at No Extra Cost for Reinforcement Learning
We introduce ES-ENAS, a simple neural architecture search (NAS) algorith...

02/09/2018  Efficient Neural Architecture Search via Parameters Sharing
We propose Efficient Neural Architecture Search (ENAS), a fast and inexp...

07/05/2023  Dynamical Isometry based Rigorous Fair Neural Architecture Search
Recently, the weight-sharing technique has significantly speeded up the ...

04/17/2020  Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks
Neural architecture search has attracted wide attentions in both academi...
