NAS-OoD: Neural Architecture Search for Out-of-Distribution Generalization

09/05/2021
by Haoyue Bai, et al.

Recent advances in Out-of-Distribution (OoD) generalization reveal the robustness of deep learning models against distribution shifts. However, existing works focus on OoD algorithms, such as invariant risk minimization, domain generalization, or stable learning, without considering the influence of deep model architectures on OoD generalization, which may lead to sub-optimal performance. Neural Architecture Search (NAS) methods search for architectures based on their performance on the training data, which may result in poor generalization on OoD tasks. In this work, we propose robust Neural Architecture Search for OoD generalization (NAS-OoD), which optimizes the architecture with respect to its performance on generated OoD data by gradient descent. Specifically, a data generator is learned to synthesize OoD data by maximizing losses computed by different neural architectures, while the goal of the architecture search is to find the optimal architecture parameters that minimize the synthetic OoD data losses. The data generator and the neural architecture are jointly optimized in an end-to-end manner, and the minimax training process effectively discovers robust architectures that generalize well under different distribution shifts. Extensive experimental results show that NAS-OoD achieves superior performance on various OoD generalization benchmarks with deep models that have far fewer parameters. In addition, on a real industry dataset, the proposed NAS-OoD method reduces the error rate by more than 70%, demonstrating the proposed method's practicality for real applications.
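The minimax training described above can be illustrated with a toy sketch: a one-parameter "generator" perturbs the training data by gradient ascent on a regularized loss, while a one-parameter "model" is updated by gradient descent on the same loss. All names, constants, and the regularization term here are illustrative assumptions, not the paper's actual generator or architecture parameterization.

```python
# Toy sketch of the NAS-OoD minimax idea on a 1-D regression problem.
# Generator parameter d perturbs the input to *maximize* the loss
# (with an L2 penalty so the perturbation stays bounded), while model
# parameter w is trained to *minimize* the loss on the perturbed data.
# Regularized loss: L(w, d) = (w*(x + d) - y)^2 - lam * d^2

x, y = 1.0, 2.0        # one training point; the true model is w = 2
w, d = 0.0, 0.0        # model weight and generator perturbation
eta, lam = 0.05, 5.0   # step size and generator regularization strength

for _ in range(2000):
    # Generator step: gradient ascent on L with respect to d
    r = w * (x + d) - y
    d += eta * (2 * r * w - 2 * lam * d)

    # Model step: gradient descent on L with respect to w
    r = w * (x + d) - y
    w -= eta * 2 * r * (x + d)

print(w, d)  # w approaches 2 and d shrinks toward 0
```

Because the generator's perturbation is penalized, the ascent step cannot diverge, and alternating the two updates drives the model toward a solution that remains accurate even under the worst bounded perturbation, mirroring how NAS-OoD searches for architectures that minimize losses on adversarially generated OoD data.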


Related research

05/15/2023 · GeNAS: Neural Architecture Search with Better Generalization
Neural Architecture Search (NAS) aims to automatically excavate the opti...

04/29/2021 · Generalization Guarantees for Neural Architecture Search with Train-Validation Split
Neural Architecture Search (NAS) is a popular method for automatically d...

10/01/2019 · Blending Diverse Physical Priors with Neural Networks
Machine learning in context of physical systems merits a re-examination ...

01/19/2021 · GIID-Net: Generalizable Image Inpainting Detection via Neural Architecture Search and Attention
Deep learning (DL) has demonstrated its powerful capabilities in the fie...

03/03/2022 · β-DARTS: Beta-Decay Regularization for Differentiable Architecture Search
Neural Architecture Search (NAS) has attracted increasingly more attenti...

02/19/2020 · Neural Architecture Search For Fault Diagnosis
Data-driven methods have made great progress in fault diagnosis, especia...

05/19/2022 · Incremental Learning with Differentiable Architecture and Forgetting Search
As progress is made on training machine learning models on incrementally...
