Learning Diverse-Structured Networks for Adversarial Robustness

02/03/2021
by Xuefeng Du, et al.

In adversarial training (AT), research has focused mainly on the objective and the optimizer, while the model itself has received less attention; as a result, the models used in AT are still the classic ones from standard training (ST). Classic network architectures (NAs) are generally inferior to searched NAs in ST, and the same should hold in AT. In this paper, we argue that NA and AT cannot be handled independently: given a dataset, the NA that is optimal in ST is no longer optimal in AT. However, AT is itself time-consuming; directly searching NAs in AT over large search spaces is computationally infeasible. We therefore propose a diverse-structured network (DS-Net) that significantly reduces the search space: instead of low-level operations, we consider only predefined atomic blocks, where an atomic block is a time-tested building block such as the residual block. Since there are only a few atomic blocks, we can weight all of them rather than pick the single best one within each searched block of DS-Net, which strikes an essential trade-off between exploring diverse structures and exploiting the best ones. Empirical results demonstrate the advantage of DS-Net, i.e., of weighting the atomic blocks.
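To make the core idea concrete, the sketch below shows a "mixed block" that weights the outputs of several candidate atomic blocks by a softmax over learnable coefficients, instead of selecting a single block. This is a minimal NumPy illustration under our own assumptions: the toy `residual_block`, `plain_block`, and `dense_block` functions are hypothetical stand-ins, not the authors' actual architectures, and the coefficients `alpha` are fixed here although they would be learned in practice.

```python
import numpy as np

def softmax(a):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(a - a.max())
    return e / e.sum()

# Illustrative "atomic blocks": cheap transforms standing in for
# time-tested building blocks (residual block, plain block, etc.).
def residual_block(x):
    # identity shortcut plus a small nonlinear transform
    return x + 0.1 * np.tanh(x)

def plain_block(x):
    # a plain nonlinear transform with no shortcut
    return np.tanh(x)

def dense_block(x):
    # crude stand-in for a densely connected block
    return 0.5 * (x + np.tanh(x))

ATOMIC_BLOCKS = [residual_block, plain_block, dense_block]

def mixed_block(x, alpha):
    """Weight all atomic blocks rather than pick the best one:
    output = sum_k softmax(alpha)_k * block_k(x)."""
    w = softmax(alpha)
    return sum(wk * blk(x) for wk, blk in zip(w, ATOMIC_BLOCKS))

x = np.linspace(-1.0, 1.0, 5)          # toy input "feature map"
alpha = np.array([0.5, 0.2, -0.1])     # learnable in practice; fixed here
y = mixed_block(x, alpha)
```

Because the softmax weights form a convex combination, the mixed output always lies between the element-wise minimum and maximum of the individual block outputs, which is what lets the network explore all structures at once with only a handful of extra parameters per block.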


