RATs-NAS: Redirection of Adjacent Trails on GCN for Neural Architecture Search

05/07/2023
by Yu-Ming Zhang, et al.

Various hand-designed CNN architectures, such as VGG, ResNet, and DenseNet, have achieved state-of-the-art (SoTA) performance on a variety of tasks. Neural Architecture Search (NAS) aims to find the best CNN architecture for such tasks automatically. However, verifying a searched architecture is very time-consuming, which has made predictor-based methods an essential branch of NAS. Two techniques commonly used to build predictors are graph convolutional networks (GCNs) and multilayer perceptrons (MLPs). In this paper, we consider how GCNs and MLPs differ on adjacent operation trails and propose Redirected Adjacent Trails NAS (RATs-NAS) to quickly search for the desired neural architecture. RATs-NAS consists of two components: the Redirected Adjacent Trails GCN (RATs-GCN) and the Predictor-based Search Space Sampling (P3S) module. RATs-GCN redirects trails and adjusts their strengths to search for a better neural architecture. P3S rapidly focuses on tighter FLOPs intervals of the search space, based on our observation in cell-based NAS that architectures with similar FLOPs tend to perform similarly. Combining RATs-GCN and P3S, RATs-NAS outperforms WeakNAS, Arch-Graph, and other methods by a significant margin on the three sub-datasets of NAS-Bench-201.
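To make the two components more concrete, the sketch below shows one way a RATs-GCN-style predictor could realize "redirected adjacent trails": the fixed cell adjacency matrix is mixed with a learnable trail-strength matrix before the standard GCN propagation. This is a minimal PyTorch sketch; the class name RATsGCNPredictor, the sigmoid-gated strength matrix, and the layer sizes are illustrative assumptions rather than the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RATsGCNPredictor(nn.Module):
    """GCN accuracy predictor whose adjacency ("trails") is mixed with a
    learnable trail-strength matrix (illustrative, not the paper's code)."""

    def __init__(self, num_nodes=8, op_dim=7, hidden=64, layers=3):
        super().__init__()
        dims = [op_dim] + [hidden] * layers
        self.gcn_layers = nn.ModuleList(
            [nn.Linear(dims[i], dims[i + 1]) for i in range(layers)]
        )
        # One learnable strength per possible trail between cell nodes.
        self.trail_strength = nn.Parameter(torch.zeros(num_nodes, num_nodes))
        self.head = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, adj, ops):
        # adj: (B, N, N) binary DAG adjacency of the cell
        # ops: (B, N, op_dim) one-hot operation features per node
        n = adj.size(1)
        # Redirect trails: add gated learnable strengths to the fixed trails.
        a = adj + torch.sigmoid(self.trail_strength).unsqueeze(0)
        a = a + torch.eye(n, device=adj.device)                   # self-loops
        a_hat = a / a.sum(dim=-1, keepdim=True).clamp_min(1e-6)   # row-normalize
        h = ops
        for lin in self.gcn_layers:
            h = F.relu(torch.bmm(a_hat, lin(h)))                  # H <- ReLU(A_hat H W)
        return self.head(h.mean(dim=1)).squeeze(-1)               # predicted accuracy
```

The P3S module is described above only at the level of focusing on tighter FLOPs intervals, motivated by the observation that architectures with similar FLOPs tend to perform similarly. The loop below is a hedged approximation of that idea rather than the paper's algorithm: it repeatedly shrinks a FLOPs window around the architecture the predictor currently ranks highest, and the surviving window would then determine which architectures are sampled for the next round of training and verification.

```python
import numpy as np

def p3s_focus(flops, scores, rounds=3, shrink=0.5):
    """Return indices of candidates inside a progressively tighter FLOPs
    interval centered on the best-predicted architecture (illustrative)."""
    flops, scores = np.asarray(flops), np.asarray(scores)
    order = np.argsort(flops)                        # candidates sorted by FLOPs
    lo, hi = 0, len(order)
    for _ in range(rounds):
        window = order[lo:hi]
        best = window[np.argmax(scores[window])]     # best predicted in window
        width = max(1, int((hi - lo) * shrink))      # tighter FLOPs interval
        center = int(np.searchsorted(flops[order], flops[best]))
        lo = max(0, center - width // 2)
        hi = min(len(order), lo + width)
    return order[lo:hi]
```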
