
Self-supervised Representation Learning for Evolutionary Neural Architecture Search

by Chen Wei, et al.

Recently proposed neural architecture search (NAS) algorithms adopt neural predictors to accelerate the architecture search. The ability of neural predictors to accurately predict the performance metrics of a neural architecture is critical to NAS, yet acquiring training datasets for neural predictors is time-consuming. How to obtain a neural predictor with high prediction accuracy from a small amount of training data is therefore a central problem in neural predictor-based NAS. First, we design a new architecture encoding scheme that overcomes the drawbacks of existing vector-based encoding schemes and enables calculating the graph edit distance between neural architectures. To enhance the predictive performance of neural predictors, we then devise two self-supervised learning methods, from different perspectives, to pre-train the architecture embedding part of a neural predictor so that it produces a meaningful representation of neural architectures. The first method trains a carefully designed two-branch graph neural network to predict the graph edit distance of two input neural architectures. The second method is inspired by the prevalent contrastive learning paradigm: we present a new contrastive learning algorithm that uses a central feature vector as a proxy to contrast positive pairs against negative pairs. Experimental results show that the pre-trained neural predictors achieve comparable or superior performance to their supervised counterparts while using several times fewer training samples. We achieve state-of-the-art performance on the NAS-Bench-101 and NAS-Bench-201 benchmarks when integrating the pre-trained neural predictors with an evolutionary NAS algorithm.
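The central-feature-vector contrastive idea can be illustrated with a minimal sketch: embeddings of augmented views of the same architecture are aggregated into a central (mean) vector, which serves as the anchor against which positives and negatives are contrasted in an InfoNCE-style loss. This is an assumed NumPy illustration of the general mechanism, not the paper's implementation; the function name and hyperparameters are hypothetical.

```python
import numpy as np

def central_contrastive_loss(positives, negatives, temperature=0.1):
    """Sketch of a contrastive loss using the mean of the positive
    embeddings as a proxy anchor (hypothetical, for illustration).

    positives: (P, d) embeddings of views of the same architecture
    negatives: (N, d) embeddings of other architectures
    """
    def l2_normalize(x):
        return x / np.linalg.norm(x, axis=-1, keepdims=True)

    positives = l2_normalize(positives)
    negatives = l2_normalize(negatives)

    # Central feature vector acts as the proxy anchor.
    center = l2_normalize(positives.mean(axis=0, keepdims=True))

    # Cosine similarities to the center, scaled by temperature.
    pos_sim = np.exp(positives @ center.T / temperature)  # (P, 1)
    neg_sim = np.exp(negatives @ center.T / temperature)  # (N, 1)

    # InfoNCE-style objective: each positive is contrasted with
    # the pooled set of negatives.
    loss = -np.log(pos_sim / (pos_sim + neg_sim.sum())).mean()
    return float(loss)
```

With well-separated positives and negatives the loss is near zero; when negatives collapse onto the positive cluster it grows, which is the behavior the pre-training objective exploits to shape the architecture embedding space.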

