Siamese-NAS: Using Trained Samples Efficiently to Find Lightweight Neural Architecture by Prior Knowledge

10/02/2022
by Yu-Ming Zhang, et al.

In the past decade, many convolutional neural network architectures were designed by hand, such as VGG16, ResNet, and DenseNet, each achieving state-of-the-art results on different tasks in its time. However, handcrafted design still relies on human intuition and experience, and trial and error consumes considerable time. Neural Architecture Search (NAS) addresses this issue. In recent works, neural predictors have improved significantly while using only a few trained architectures as training samples, yet the cost of obtaining those trained samples remains considerable. In this paper, our proposed Siamese-Predictor is inspired by past work on predictor-based NAS. It is constructed with the proposed Estimation Code, which encodes prior knowledge about the training procedure. The Siamese-Predictor benefits significantly from this idea, allowing it to surpass the current SOTA predictor on NASBench-201. To explore the impact of the Estimation Code, we analyze its relationship with accuracy. We also propose Tiny-NanoBench, a search space for lightweight CNN architectures; in this well-designed search space, it is easier to find good architectures with few FLOPs than in NASBench-201. In summary, the proposed Siamese-Predictor is a predictor-based NAS method that achieves SOTA-level results, especially under limited computation budgets. Applied to the proposed Tiny-NanoBench, it can find extremely lightweight CNN architectures using only a few trained samples.
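The abstract describes a Siamese predictor that scores candidate architectures using an Estimation Code as prior knowledge about the training procedure. The paper's exact design is not given here, so the following is a minimal PyTorch sketch of what such a weight-shared, pairwise predictor could look like; the encoding dimensions, the MLP layout, the fusion by concatenation, and the pairwise ranking loss are all illustrative assumptions, not the authors' architecture.

# Hypothetical sketch of a Siamese accuracy predictor (PyTorch).
# The encoding sizes and the way the Estimation Code is fused are
# assumptions for illustration, not the paper's actual design.
import torch
import torch.nn as nn

class SiamesePredictor(nn.Module):
    def __init__(self, arch_dim=32, est_dim=8, hidden=64):
        super().__init__()
        # Shared branch: embeds one architecture encoding concatenated
        # with its Estimation Code (prior knowledge about training).
        self.branch = nn.Sequential(
            nn.Linear(arch_dim + est_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 1)  # scalar accuracy score

    def score(self, arch, est_code):
        return self.head(self.branch(torch.cat([arch, est_code], dim=-1)))

    def forward(self, arch_a, est_a, arch_b, est_b):
        # Both candidates pass through the same (weight-shared) branch;
        # the output is the predicted score difference of the pair.
        return self.score(arch_a, est_a) - self.score(arch_b, est_b)

def ranking_loss(diff, label):
    # Pairwise ranking objective: label = 1.0 if the first architecture
    # has the higher measured accuracy, else 0.0.
    return nn.functional.binary_cross_entropy_with_logits(
        diff.squeeze(-1), label)

In this sketch, both members of a pair pass through the same branch, so the predictor learns a relative ranking rather than absolute accuracies; training on pairs is one common way Siamese predictors squeeze more supervision out of a small set of trained samples.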


Related research

03/28/2020
NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search
Neural architecture search (NAS) is a promising method for automatically...

05/07/2023
RATs-NAS: Redirection of Adjacent Trails on GCN for Neural Architecture Search
Various hand-designed CNN architectures have been developed, such as VGG...

10/26/2022
PredNAS: A Universal and Sample Efficient Neural Architecture Search Framework
In this paper, we present a general and effective framework for Neural A...

09/30/2019
RNAS: Architecture Ranking for Powerful Networks
Neural Architecture Search (NAS) is attractive for automatically produci...

02/21/2021
Weak NAS Predictors Are All You Need
Neural Architecture Search (NAS) finds the best network architecture by ...

01/31/2023
NASiam: Efficient Representation Learning using Neural Architecture Search for Siamese Networks
Siamese networks are one of the most trending methods to achieve self-su...

03/08/2022
UENAS: A Unified Evolution-based NAS Framework
Neural architecture search (NAS) has gained significant attention for au...
