WPNAS: Neural Architecture Search by jointly using Weight Sharing and Predictor

03/04/2022
by Ke Lin, et al.

Weight-sharing-based and predictor-based methods are two major families of fast neural architecture search (NAS) methods. In this paper, we propose to use weight sharing and a predictor jointly in a unified framework. First, we construct a SuperNet in a weight-sharing way and probabilistically sample architectures from it. To improve the accuracy of architecture evaluation, we complement direct evaluation with the inherited SuperNet weights by a few-shot predictor that assesses each architecture from a second perspective. The final evaluation of an architecture combines the direct evaluation, the predictor's estimate, and the architecture's cost. We treat this evaluation as a reward and apply a self-critical policy gradient approach to update the architecture probabilities. To further reduce the side effects of weight sharing, we propose a weakly weight-sharing scheme that introduces an additional HyperNet. We conduct experiments on CIFAR-10, CIFAR-100 and ImageNet under the NATS-Bench, DARTS and MobileNet search spaces. The proposed WPNAS method achieves state-of-the-art performance on these datasets.
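To make the combined-reward, self-critical update concrete, below is a minimal PyTorch sketch of the idea as described in the abstract. Everything specific in it is an illustrative assumption, not the paper's actual implementation: the toy search space (per-edge categorical distributions over operations), the stub evaluators `direct_eval`, `predictor_eval` and `cost_of`, and the mixing weights `alpha`, `beta`, `gamma` are all hypothetical stand-ins.

```python
import torch

# Hypothetical toy search space: NUM_EDGES edges, each choosing one of NUM_OPS ops.
NUM_EDGES, NUM_OPS = 6, 5
logits = torch.zeros(NUM_EDGES, NUM_OPS, requires_grad=True)  # architecture probabilities
optimizer = torch.optim.Adam([logits], lr=3e-4)

def direct_eval(arch):
    # Stub: accuracy of `arch` evaluated with weights inherited from the SuperNet.
    return torch.rand(1).item()

def predictor_eval(arch):
    # Stub: accuracy of `arch` estimated by the few-shot predictor.
    return torch.rand(1).item()

def cost_of(arch):
    # Stub: normalized cost proxy (e.g. FLOPs / latency) of the architecture.
    return sum(arch) / (NUM_EDGES * (NUM_OPS - 1))

def reward(arch, alpha=0.5, beta=0.5, gamma=0.1):
    # Combined evaluation: direct score + predictor score - architecture cost.
    # alpha, beta, gamma are assumed mixing weights, not the paper's values.
    return alpha * direct_eval(arch) + beta * predictor_eval(arch) - gamma * cost_of(arch)

for step in range(100):
    dist = torch.distributions.Categorical(logits=logits)
    sampled = dist.sample()           # stochastically sampled architecture
    greedy = logits.argmax(dim=-1)    # greedy architecture serves as the baseline
    # Self-critical advantage: sampled reward relative to the greedy baseline.
    advantage = reward(sampled.tolist()) - reward(greedy.tolist())
    # REINFORCE loss: push probability mass toward architectures that beat the baseline.
    loss = -(advantage * dist.log_prob(sampled).sum())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The self-critical baseline (reward of the greedy architecture) reduces the variance of the policy gradient without requiring a learned value function, which is why it pairs naturally with cheap, noisy SuperNet evaluations.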

