Adversarially Robust Neural Architectures

09/02/2020
by Minjing Dong, et al.

Deep Neural Networks (DNNs) are vulnerable to adversarial attacks. Existing defenses are largely devoted to developing robust training strategies or regularizations that update the weights of the network. Beyond the weights, however, the overall structure and information flow of the network are explicitly determined by its architecture, which remains largely unexplored for robustness. This paper therefore aims to improve adversarial robustness from the architecture perspective within a Neural Architecture Search (NAS) framework. We study the relationship among adversarial robustness, the Lipschitz constant, and the architecture parameters, and show that an appropriate constraint on the architecture parameters can reduce the Lipschitz constant and thereby improve robustness. In the standard NAS framework, all architecture parameters are treated equally when the discrete architecture is sampled from the supernet. However, the importance of an architecture parameter can vary from operation to operation or from connection to connection; ignoring this variation may reduce the confidence of robust architecture sampling. We therefore propose to sample the architecture parameters from trainable multivariate log-normal distributions, under which the Lipschitz constant of the entire network can be approximated by a univariate log-normal distribution whose mean and variance are determined by the architecture parameters. Compared with adversarially trained architectures searched by various NAS algorithms as well as efficient human-designed models, our algorithm empirically achieves the best performance under various attacks on different datasets.
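The statistical fact behind the log-normal parameterization is that sums of independent normal variables are normal, so products of independent log-normal variables are again log-normal; if each layer's contribution to a Lipschitz bound is treated multiplicatively, the bound on the whole network is then approximately log-normal with mean and variance given by sums over the architecture parameters. The sketch below illustrates only the sampling side of this idea for a DARTS-style mixed operation; it is a minimal, hypothetical example rather than the paper's implementation. The class name LogNormalMixedOp, the diagonal (independent) distribution, and the weight normalization are illustrative assumptions, whereas the paper uses a multivariate distribution.

```python
# Hypothetical sketch: architecture parameters drawn from trainable
# log-normal distributions for a DARTS-style mixed operation.
# Names (LogNormalMixedOp, candidate_ops) are illustrative, not from the paper.
import torch
import torch.nn as nn

class LogNormalMixedOp(nn.Module):
    def __init__(self, candidate_ops):
        super().__init__()
        self.ops = nn.ModuleList(candidate_ops)
        # Trainable mean and log-std of the underlying normal distribution
        # for each candidate operation's architecture parameter.
        self.mu = nn.Parameter(torch.zeros(len(candidate_ops)))
        self.log_sigma = nn.Parameter(torch.zeros(len(candidate_ops)))

    def forward(self, x):
        # Reparameterized sample: alpha ~ LogNormal(mu, sigma^2), i.e. exp(N(mu, sigma^2)).
        eps = torch.randn_like(self.mu)
        alpha = torch.exp(self.mu + torch.exp(self.log_sigma) * eps)
        # Normalize so the operation weights sum to one (one possible choice).
        weights = alpha / alpha.sum()
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Usage: mix a 3x3 convolution with a skip connection on a toy input.
ops = [nn.Conv2d(8, 8, 3, padding=1), nn.Identity()]
mixed = LogNormalMixedOp(ops)
out = mixed(torch.randn(2, 8, 16, 16))
print(out.shape)  # torch.Size([2, 8, 16, 16])
```

Because the sample is reparameterized, mu and log_sigma stay differentiable, so a distribution over architectures can in principle be trained jointly with the network weights.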


Related research:
- An Empirical Study on the Robustness of NAS based Architectures (07/16/2020)
- The robust way to stack and bag: the local Lipschitz way (06/01/2022)
- Efficient Search of Comprehensively Robust Neural Architectures via Multi-fidelity Evaluation (05/12/2023)
- AdvRush: Searching for Adversarially Robust Neural Architectures (08/03/2021)
- When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks (11/25/2019)
- Towards a Robust Differentiable Architecture Search under Label Noise (10/23/2021)
- NAR-Former: Neural Architecture Representation Learning towards Holistic Attributes Prediction (11/15/2022)
