Improving Differentiable Architecture Search with a Generative Model

11/30/2021
by Ruisi Zhang et al.

In differentiable neural architecture search (NAS) algorithms such as DARTS, the training set used to update the model weights and the validation set used to update the model architecture are sampled from the same data distribution, so uncommon features in the dataset receive too little attention during training. In this paper, instead of introducing a more complex NAS algorithm, we explore the idea that adding a high-quality synthesized dataset to training can help the classification model identify its weaknesses and improve recognition accuracy. We introduce a training strategy called "Differentiable Architecture Search with a Generative Model (DASGM)." In DASGM, the training set is used to update the classification model's weights, while a synthesized dataset is used to train its architecture. Because the generated images follow a different distribution from the training set, they can help the classification model learn better features and identify its weaknesses. We formulate DASGM as a multi-level optimization framework and develop an effective algorithm to solve it. Experiments on CIFAR-10, CIFAR-100, and ImageNet demonstrate the effectiveness of DASGM. Code will be made available.
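The alternating update the abstract describes can be sketched in a few lines. The code below is a minimal, first-order illustration under stated assumptions, not the paper's implementation: TinySupernet, the stand-in conditional generator, and all hyperparameters are hypothetical, and DASGM's full multi-level optimization (which also updates the generative model) is omitted.

```python
# Hypothetical sketch: weights trained on real data, architecture
# parameters ("alpha") trained on synthesized data, DARTS-style.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySupernet(nn.Module):
    """A one-cell supernet: softmax-weighted mixture of candidate ops."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(3, 16, 3, padding=1),   # candidate op 1
            nn.Conv2d(3, 16, 5, padding=2),   # candidate op 2
        ])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))  # architecture params
        self.head = nn.Linear(16, num_classes)

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        h = sum(wi * op(x) for wi, op in zip(w, self.ops))
        return self.head(F.adaptive_avg_pool2d(F.relu(h), 1).flatten(1))

def dasgm_style_step(model, generator, x, y, w_optim, a_optim, num_classes=10):
    # Step 1: update model weights on a real training batch.
    w_optim.zero_grad()
    F.cross_entropy(model(x), y).backward()
    w_optim.step()
    # Step 2: update architecture parameters on synthesized images,
    # whose distribution differs from the training set.
    z = torch.randn(x.size(0), 32)
    y_syn = torch.randint(0, num_classes, (x.size(0),))
    with torch.no_grad():
        x_syn = generator(z, y_syn)  # hypothetical conditional generator
    a_optim.zero_grad()
    F.cross_entropy(model(x_syn), y_syn).backward()
    a_optim.step()  # only alpha is updated here

model = TinySupernet()
weights = [p for n, p in model.named_parameters() if n != "alpha"]
w_optim = torch.optim.SGD(weights, lr=0.025, momentum=0.9)
a_optim = torch.optim.Adam([model.alpha], lr=3e-4)
generator = lambda z, y: torch.randn(z.size(0), 3, 32, 32)  # stand-in generator
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
dasgm_style_step(model, generator, x, y, w_optim, a_optim)
```

Keeping separate optimizers for the weights and the architecture parameters mirrors the DARTS convention; the only change sketched here is that the architecture step consumes synthesized rather than held-out real data.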

Related research

- 11/25/2021 · BaLeNAS: Differentiable Architecture Search via the Bayesian Learning Rule
- 08/10/2020 · RARTS: a Relaxed Architecture Search Method
- 08/20/2021 · D-DARTS: Distributed Differentiable Architecture Search
- 12/16/2019 · UNAS: Differentiable Architecture Search Meets Reinforcement Learning
- 09/18/2020 · Faster Gradient-based NAS Pipeline Combining Broad Scalable Architecture with Confident Learning Rate
- 12/01/2021 · Learning from Mistakes based on Class Weighting with Application to Neural Architecture Search
- 12/03/2021 · Data-Free Neural Architecture Search via Recursive Label Calibration
