Architecture Augmentation for Performance Predictor Based on Graph Isomorphism

07/03/2022
by Xiangning Xie, et al.

Neural Architecture Search (NAS) can automatically design architectures for deep neural networks (DNNs) and has become one of the hottest research topics in the machine learning community. However, NAS is often computationally expensive because a large number of DNNs must be trained to obtain their performance during the search process. Performance predictors can greatly alleviate this prohibitive cost by directly predicting the performance of DNNs. However, building a satisfactory performance predictor depends heavily on having enough trained DNN architectures, which are difficult to obtain in most scenarios. To address this critical issue, we propose an effective DNN architecture augmentation method named GIAug. Specifically, we first propose a mechanism based on graph isomorphism, which can efficiently generate up to a factorial of n (i.e., n!) diverse annotated architectures from a single architecture with n nodes. In addition, we design a generic method to encode the architectures into a form suitable for most prediction models, so that GIAug can be flexibly used by various existing predictor-based NAS algorithms. We perform extensive experiments on the CIFAR-10 and ImageNet benchmark datasets over small-, medium- and large-scale search spaces. The experiments show that GIAug significantly enhances the performance of most state-of-the-art peer predictors. Moreover, GIAug saves up to three orders of magnitude in computation cost on ImageNet while achieving performance similar to state-of-the-art NAS algorithms.
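The core augmentation idea lends itself to a short illustration. The sketch below is a minimal, hypothetical rendering of graph-isomorphism-based augmentation, not the authors' released code: a cell architecture is treated as a DAG with an adjacency matrix and a per-node operation list, and each of the n! node permutations yields an isomorphic graph describing the same network, so every permuted copy inherits the original performance label. The function name `giaug_augment` and the `(adjacency, ops, label)` encoding are illustrative assumptions.

```python
import itertools
import numpy as np

def giaug_augment(adjacency, ops, label, max_permutations=None):
    """Generate isomorphic copies of a cell architecture (sketch).

    adjacency: (n, n) 0/1 numpy array, the cell's DAG adjacency matrix
    ops:       length-n list of node operation names
    label:     measured performance of the architecture (e.g., val accuracy)

    Each node permutation relabels the graph without changing the network
    it describes, so all copies share the same performance label.
    """
    n = len(ops)
    perms = itertools.permutations(range(n))
    if max_permutations is not None:
        perms = itertools.islice(perms, max_permutations)

    augmented = []
    for perm in perms:
        p = np.asarray(perm)
        # New node i corresponds to old node p[i]:
        # new_adj[i, j] = adjacency[p[i], p[j]].
        new_adj = adjacency[np.ix_(p, p)]
        new_ops = [ops[i] for i in p]
        augmented.append((new_adj, new_ops, label))  # same label for all
    return augmented
```

In practice one would cap the number of permutations (n! grows quickly with n) and encode the augmented triples into whatever input format the chosen predictor expects before training.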
