Training-free Neural Architecture Search for RNNs and Transformers

06/01/2023
by Aaron Serianni, et al.

Neural architecture search (NAS) has enabled the automatic creation of new and effective neural network architectures, offering an alternative to the laborious process of manually designing complex architectures. However, traditional NAS algorithms are slow and require immense amounts of computing power. Recent research has investigated training-free NAS metrics for image classification architectures, drastically speeding up search algorithms. In this paper, we investigate training-free NAS metrics for recurrent neural network (RNN) and BERT-based transformer architectures, targeting language modeling tasks. First, we develop a new training-free metric, named hidden covariance, that predicts the trained performance of an RNN architecture and significantly outperforms existing training-free metrics. We experimentally evaluate the effectiveness of the hidden covariance metric on the NAS-Bench-NLP benchmark. Second, we find that the current search space paradigm for transformer architectures is not optimized for training-free neural architecture search. Instead, a simple qualitative analysis can effectively shrink the search space to the best-performing architectures. This conclusion is based on our investigation of existing training-free metrics and new metrics developed from recent transformer pruning literature, evaluated on our own benchmark of trained BERT architectures. Ultimately, our analysis shows that the architecture search space and the training-free metric must be developed together to achieve effective results.
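The abstract does not spell out how the hidden covariance metric is computed, but a covariance-based training-free metric of this kind can be sketched. The sketch below passes a single random minibatch through an untrained RNN, centers the final hidden states, and scores the architecture by the log-determinant of their covariance matrix; the log-determinant scoring rule and the helper name `hidden_covariance_score` are illustrative assumptions in the spirit of related training-free NAS metrics, not the paper's exact formulation.

```python
# Illustrative sketch of a hidden-covariance-style training-free metric.
# Assumption: score = log-determinant of the batch covariance of final
# hidden states; the paper's exact formula is not given in the abstract.
import torch
import torch.nn as nn


def hidden_covariance_score(rnn: nn.Module, batch: torch.Tensor) -> float:
    """Score an untrained RNN by the covariance of its final hidden states.

    rnn:   an untrained torch RNN module (e.g. nn.LSTM with batch_first=True).
    batch: input minibatch of shape (N, seq_len, input_size).
    """
    rnn.eval()
    with torch.no_grad():
        _, hidden = rnn(batch)
        # nn.LSTM returns (h_n, c_n); nn.RNN/nn.GRU return h_n directly.
        # Take the last layer's hidden state, shape (N, hidden_size).
        h = hidden[0][-1] if isinstance(hidden, tuple) else hidden[-1]
        h = h - h.mean(dim=0, keepdim=True)      # center across the batch
        cov = (h @ h.t()) / h.shape[1]           # N x N sample covariance
        # A higher log-determinant means the hidden states are more
        # decorrelated, which such metrics treat as a proxy for trainability.
        _, logdet = torch.slogdet(cov + 1e-4 * torch.eye(h.shape[0]))
        return logdet.item()


# Usage: rank candidate architectures with one random batch, no training.
batch = torch.randn(32, 20, 64)
candidates = {
    "lstm_small": nn.LSTM(64, 128, num_layers=1, batch_first=True),
    "lstm_deep":  nn.LSTM(64, 128, num_layers=3, batch_first=True),
}
scores = {name: hidden_covariance_score(m, batch) for name, m in candidates.items()}
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

Because the score needs only one forward pass over a single minibatch of random inputs, ranking an entire candidate pool costs a negligible fraction of the compute of training even one architecture, which is the appeal of training-free NAS.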


