Bridged Adversarial Training

08/25/2021
by Hoki Kim, et al.

Adversarial robustness is considered a required property of deep neural networks. In this study, we discover that adversarially trained models can have significantly different characteristics in terms of margin and smoothness, even when they show similar robustness. Inspired by this observation, we investigate the effect of different regularizers and discover the negative effect of the smoothness regularizer on margin maximization. Based on these analyses, we propose a new method, bridged adversarial training, that mitigates this negative effect by bridging the gap between clean and adversarial examples. We provide theoretical and empirical evidence that the proposed method achieves stable and improved robustness, especially against large perturbations.
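The abstract describes the method only at a high level. The sketch below shows one way a bridged training objective could look in PyTorch: intermediate points are interpolated between each clean example and its adversarial counterpart, and a KL penalty is summed over consecutive points along that path. This is a minimal illustration under stated assumptions; the function name bridged_loss, the hyperparameters num_bridges and beta, and the TRADES-style KL regularizer are illustrative choices, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def bridged_loss(model, x_clean, x_adv, y, num_bridges=3, beta=6.0):
    """Cross-entropy on clean inputs plus KL terms between consecutive
    points on the straight line from x_clean to x_adv (a "bridge")."""
    logits_clean = model(x_clean)
    loss = F.cross_entropy(logits_clean, y)

    # Interpolation coefficients for x_0 = x_clean, ..., x_m = x_adv.
    lambdas = torch.linspace(0.0, 1.0, num_bridges + 1, device=x_clean.device)
    prev_logits = logits_clean
    for lam in lambdas[1:]:
        x_mid = (1.0 - lam) * x_clean + lam * x_adv
        logits_mid = model(x_mid)
        # Penalize how much the prediction changes along each segment.
        loss = loss + (beta / num_bridges) * F.kl_div(
            F.log_softmax(logits_mid, dim=1),
            F.softmax(prev_logits, dim=1),
            reduction="batchmean",
        )
        prev_logits = logits_mid
    return loss
```

In a training loop, x_adv would come from a standard attack such as PGD on the current model, and bridged_loss would replace the usual adversarial loss before the backward pass.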


Related research

05/23/2022  Collaborative Adversarial Training
The vulnerability of deep neural networks (DNNs) to adversarial examples...

05/24/2018  Laplacian Power Networks: Bounding Indicator Function Smoothness for Adversarial Defense
Deep Neural Networks often suffer from lack of robustness to adversarial...

05/22/2019  Convergence and Margin of Adversarial Training on Separable Data
Adversarial training is a technique for training robust machine learning...

11/23/2018  Robustness via curvature regularization, and vice versa
State-of-the-art classifiers have been shown to be largely vulnerable to...

10/22/2020  Adversarial Robustness of Supervised Sparse Coding
Several recent results provide theoretical insights into the phenomena o...

04/01/2023  Improving Fast Adversarial Training with Prior-Guided Knowledge
Fast adversarial training (FAT) is an efficient method to improve robust...

02/22/2019  On the Sensitivity of Adversarial Robustness to Input Data Distributions
Neural networks are vulnerable to small adversarial perturbations. Exist...
