A3T: Adversarially Augmented Adversarial Training

01/12/2018
by Akram Erraqabi, et al.

Recent research has shown that deep neural networks are highly sensitive to so-called adversarial perturbations: tiny changes to the input purposely designed to fool a machine learning classifier. Most classification models, including deep learning models, are highly vulnerable to such attacks. In this work, we investigate a procedure for improving the adversarial robustness of deep neural networks by enforcing representation invariance. The idea is to train the classifier jointly with a discriminator that is attached to one of its hidden layers and trained to filter out the adversarial noise. We perform preliminary experiments to test the viability of the approach and to compare it to other standard adversarial training methods.
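The joint training described in the abstract can be made concrete with a short sketch. Below is a minimal PyTorch illustration of the idea, assuming a one-step FGSM adversary, a small MLP classifier, and a binary discriminator reading the hidden layer; the names (`Classifier`, `disc`, `fgsm`), layer sizes, loss weight `lam`, and optimizer settings are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Classifier(nn.Module):
    """Classifier whose hidden representation is exposed to a discriminator."""
    def __init__(self, in_dim=784, hidden=256, n_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):
        h = self.encoder(x)  # hidden layer the discriminator reads
        return self.head(h), h

model = Classifier()
# Binary discriminator: clean (label 1) vs. adversarial (label 0) representation.
disc = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 1))
opt_c = torch.optim.Adam(model.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)

def fgsm(x, y, eps=0.1):
    """One-step FGSM attack, used here as a stand-in adversary."""
    x = x.clone().detach().requires_grad_(True)
    logits, _ = model(x)
    F.cross_entropy(logits, y).backward()
    return (x + eps * x.grad.sign()).detach()

def train_step(x, y, lam=1.0):
    x_adv = fgsm(x, y)
    ones = torch.ones(len(x), 1)
    zeros = torch.zeros(len(x), 1)

    # 1) Discriminator update: tell clean from adversarial representations.
    _, h_clean = model(x)
    _, h_adv = model(x_adv)
    d_loss = F.binary_cross_entropy_with_logits(
        torch.cat([disc(h_clean.detach()), disc(h_adv.detach())]),
        torch.cat([ones, zeros]))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # 2) Classifier update: classify both clean and adversarial inputs
    #    correctly, while pushing adversarial representations to look clean
    #    to the discriminator (the representation-invariance term).
    logits_clean, _ = model(x)
    logits_adv, h_adv = model(x_adv)
    cls_loss = F.cross_entropy(logits_clean, y) + F.cross_entropy(logits_adv, y)
    inv_loss = F.binary_cross_entropy_with_logits(disc(h_adv), ones)
    total = cls_loss + lam * inv_loss
    opt_c.zero_grad()
    total.backward()
    opt_c.step()
    return total.item()
```

The alternating updates are the usual GAN-style approximation of the joint objective: the discriminator learns to detect adversarial noise in the hidden layer, and the classifier learns a representation on which that detection fails, on top of the standard adversarial training losses.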



research · 12/10/2018
Defending against Universal Perturbations with Shared Adversarial Training
Classifiers such as deep neural networks have been shown to be vulnerabl...

research · 08/26/2021
Understanding the Logit Distributions of Adversarially-Trained Deep Neural Networks
Adversarial defenses train deep neural networks to be invariant to the i...

research · 10/17/2019
Enforcing Linearity in DNN succours Robustness and Adversarial Image Generation
Recent studies on the adversarial vulnerability of neural networks have ...

research · 03/09/2022
Robust Federated Learning Against Adversarial Attacks for Speech Emotion Recognition
Due to the development of machine learning and speech processing, speech...

research · 07/10/2020
Improving Adversarial Robustness by Enforcing Local and Global Compactness
The fact that deep neural networks are susceptible to crafted perturbati...

research · 03/30/2020
Towards Deep Learning Models Resistant to Large Perturbations
Adversarial robustness has proven to be a required property of machine l...

research · 07/26/2018
HiDDeN: Hiding Data With Deep Networks
Recent work has shown that deep neural networks are highly sensitive to ...
