GNAS: A Generalized Neural Network Architecture Search Framework

03/19/2021
by Dige Ai, et al.

In practice, the problems encountered when training NAS (Neural Architecture Search) are rarely isolated; a combination of difficulties is usually faced (inaccurate performance estimation, the curse of dimensionality, overfitting, high complexity, etc.). From the standpoint of solving practical problems, this paper draws on and improves previous research that addresses only a single one of these problems, and combines the techniques into a practical pipeline. We propose a framework that decouples the network structure from the operator search space. Two BOHB (Bayesian Optimization and Hyperband) instances search alternately over the vast structure and operator spaces, and a GCN-based predictor is then trained on feedback from the child models. This approach mitigates the curse of dimensionality while improving search efficiency. Since the activation function and the weight initialization are also important components of a neural network and affect its generalization ability, we introduce an activation-function domain and an initialization-method domain and join them to the operator search space, forming a generalized search space that improves the generalization ability of the child models. Finally, we apply our framework to neural architecture search and achieve significant improvements on multiple datasets.
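To make the alternating-search idea concrete, below is a minimal Python sketch of the decoupled search loop. It is an illustration under stated assumptions, not the authors' code: random sampling stands in for the two BOHB optimizers (real BOHB couples TPE-style Bayesian proposals with Hyperband budget allocation), evaluate_child is a hypothetical stub for training a child model, and the space contents are invented. Note how the operator space also carries the activation and initialization domains, mirroring the generalized search space described in the abstract.

import random

# Hypothetical, illustrative search spaces (not the paper's).
STRUCTURE_SPACE = {"depth": [8, 14, 20], "width": [16, 32, 64]}
OPERATOR_SPACE = {"op": ["conv3x3", "conv5x5", "sep_conv", "skip"],
                  "activation": ["relu", "swish", "mish"],
                  "init": ["he_normal", "xavier_uniform", "orthogonal"]}

def sample(space):
    """Stand-in for one BOHB proposal step (real BOHB is model-based)."""
    return {k: random.choice(v) for k, v in space.items()}

def evaluate_child(structure, operators):
    """Hypothetical stub: train the child model, return validation accuracy."""
    return random.random()

best = (None, None, -1.0)
structure, operators = sample(STRUCTURE_SPACE), sample(OPERATOR_SPACE)
for step in range(20):
    if step % 2 == 0:        # freeze operators, search the structure space
        structure = sample(STRUCTURE_SPACE)
    else:                    # freeze structure, search the operator space
        operators = sample(OPERATOR_SPACE)
    score = evaluate_child(structure, operators)
    if score > best[2]:
        best = (structure, operators, score)
print(best)

Because one space is frozen while the other is sampled, neither optimizer ever faces the full joint space, which is the dimensionality-reduction argument made in the abstract.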
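The GCN-based predictor can likewise be sketched in a few lines. The following PyTorch module is a hedged illustration, not the paper's model: the architecture is encoded as a normalized adjacency matrix plus one-hot operator features per node, two graph convolutions produce node embeddings, and a mean readout feeds a sigmoid head that predicts child-model accuracy. All hyperparameters and the toy inputs are assumptions.

import torch
import torch.nn as nn

class GCNPredictor(nn.Module):
    """Toy GCN mapping an architecture graph to a predicted accuracy."""
    def __init__(self, n_ops, hidden=32):
        super().__init__()
        self.w1 = nn.Linear(n_ops, hidden)
        self.w2 = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, adj, feats):
        # adj: (n, n) normalized adjacency with self-loops
        # feats: (n, n_ops) one-hot operator choice per node
        h = torch.relu(adj @ self.w1(feats))            # graph convolution 1
        h = torch.relu(adj @ self.w2(h))                # graph convolution 2
        return torch.sigmoid(self.out(h.mean(dim=0)))   # readout -> accuracy

# Toy usage: a 5-node cell with 4 candidate operators.
predictor = GCNPredictor(n_ops=4)
adj = torch.full((5, 5), 0.2)     # stand-in normalized adjacency
feats = torch.eye(5)[:, :4]       # stand-in one-hot operator features
print(predictor(adj, feats))      # predicted accuracy in (0, 1)

Trained with a regression loss on (architecture, measured accuracy) pairs collected from evaluated child models, such a predictor lets the search score candidates cheaply instead of fully training each one.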
