
Multi-shot NAS for Discovering Adversarially Robust Convolutional Neural Architectures at Targeted Capacities

by Xuefei Ning, et al.

Convolutional neural networks (CNNs) are vulnerable to adversarial examples, and studies show that increasing the model capacity of an architecture topology (e.g., width expansion) can bring consistent robustness improvements. This reveals a clear robustness-efficiency trade-off that should be considered in architecture design. Recent studies have employed one-shot neural architecture search (NAS) to discover adversarially robust architectures. However, since the capacities of different topologies cannot be easily aligned during the search process, current one-shot NAS methods might favor topologies with larger capacity in the supernet, and the discovered topology might be sub-optimal when aligned to the targeted capacity. This paper proposes a novel multi-shot NAS method to explicitly search for adversarially robust architectures at a certain targeted capacity. Specifically, we estimate the reward at the targeted capacity using interior extrapolation of the rewards from multiple supernets. Experimental results demonstrate the effectiveness of the proposed method. For instance, at the targeted FLOPs of 1560M, the discovered MSRobNet-1560 (clean accuracy 84.8%) outperforms RobNet-free (clean accuracy 82.8%).
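The core idea, estimating a topology's reward at a targeted capacity from rewards measured in multiple supernets of different capacities, can be sketched as a simple interpolation over (capacity, reward) pairs. This is a minimal illustration under stated assumptions: the function name, the piecewise-linear interpolation choice, and the toy numbers are all hypothetical, not the paper's exact estimator.

```python
# Hedged sketch: estimate the reward of a candidate topology at a
# targeted capacity by interpolating rewards measured in several
# supernets of different capacities. All names/values are illustrative.
import numpy as np

def estimate_reward_at_capacity(capacities, rewards, target_capacity):
    """Interpolate (capacity, reward) measurements to estimate the
    reward at the targeted capacity (e.g., a FLOPs budget)."""
    order = np.argsort(capacities)
    caps = np.asarray(capacities, dtype=float)[order]
    rews = np.asarray(rewards, dtype=float)[order]
    # np.interp performs piecewise-linear interpolation for interior points.
    return float(np.interp(target_capacity, caps, rews))

# Toy example: one candidate topology evaluated in three supernets
# whose aligned capacities are 800M, 1600M, and 3200M FLOPs.
print(estimate_reward_at_capacity([800, 1600, 3200],
                                  [0.70, 0.78, 0.82],
                                  1560))  # estimate at the 1560M target
```

In practice a smoother model (e.g., a fitted curve over capacity) could replace the piecewise-linear choice; the point is only that rewards from several supernets let one score topologies at a capacity none of the supernets matches exactly.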

