Data-Free Neural Architecture Search via Recursive Label Calibration

12/03/2021
by Zechun Liu, et al.

This paper explores the feasibility of neural architecture search (NAS) given only a pre-trained model, without access to any original training data. This is an important scenario for privacy protection, bias avoidance, etc., in real-world applications. To achieve this, we start by synthesizing usable data through recovering the knowledge embedded in a pre-trained deep neural network. We then use the synthesized data and their predicted soft labels to guide the architecture search. We identify that the NAS task requires synthesized data (we target the image domain here) with sufficient semantics, diversity, and a minimal domain gap from natural images. For semantics, we propose recursive label calibration to produce more informative outputs. For diversity, we propose a regional update strategy to generate more diverse and semantically enriched synthetic data. For a minimal domain gap, we use input- and feature-level regularization to mimic the original data distribution in latent space. We instantiate our proposed framework with three popular NAS algorithms: DARTS, ProxylessNAS, and SPOS. Surprisingly, our results demonstrate that the architectures discovered by searching with our synthetic data achieve accuracy comparable to, or even higher than, those discovered by searching with the original data. This shows, for the first time, that NAS can be done effectively without access to the original (natural) data, provided the synthesis method is well designed. Our code will be publicly available.
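The abstract does not give the exact formulation of recursive label calibration, but the idea — turning a hard class label into a more informative soft label by repeatedly blending it with the model's own softened prediction — can be sketched as follows. This is a minimal, hypothetical illustration in plain NumPy: the function name, the blending coefficient `beta`, and the fixed number of `rounds` are assumptions for exposition, not the paper's actual procedure.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def recursive_label_calibration(logits, target, beta=0.5, rounds=3):
    """Hypothetical sketch: recursively blend the one-hot target with the
    model's soft prediction, yielding a calibrated soft label that keeps
    the target class dominant while encoding inter-class similarity.

    logits: (batch, num_classes) model outputs on the synthetic images
    target: (batch,) integer class indices used to synthesize the images
    """
    num_classes = logits.shape[-1]
    label = np.eye(num_classes)[target]          # start from the hard label
    pred = softmax(logits)                       # model's soft prediction
    for _ in range(rounds):
        label = beta * label + (1.0 - beta) * pred
    return label

# Example: one synthetic sample targeted at class 0
soft = recursive_label_calibration(np.array([[2.0, 1.0, 0.1]]), np.array([0]))
```

Because each round is a convex combination, the result remains a valid probability distribution, and for a moderate `beta` the target class stays the argmax while other classes receive informative non-zero mass.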

