Mathematical Analysis of Adversarial Attacks

11/15/2018 · by Zehao Dou et al.

In this paper, we analyze the efficacy of the fast gradient sign method (FGSM) and the Carlini–Wagner L2 (CW-L2) attack. We prove that, within a certain regime, the untargeted FGSM can fool any convolutional neural network (CNN) with ReLU activation, and that the targeted FGSM can mislead any such CNN into classifying any given image as any prescribed class. For a special two-layer neural network, a linear layer followed by the softmax output activation, we show that the CW-L2 attack increases the ratio of the classification probability of the target class to that of the ground-truth class. Moreover, we provide numerical results verifying all of our theoretical results.
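As a concrete illustration of the one-step attack analyzed in the abstract, the sketch below implements both the untargeted and targeted FGSM updates. This is a minimal sketch, not the paper's code: the PyTorch classifier `model`, the step size `epsilon`, and the [0, 1] pixel range are illustrative assumptions.

import torch
import torch.nn.functional as F

def fgsm(model, x, y, epsilon, targeted=False):
    # One-step FGSM. For the targeted variant, `y` is the prescribed
    # target class; otherwise it is the ground-truth label.
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    step = epsilon * x.grad.sign()
    # Untargeted: step *up* the loss gradient on the true label.
    # Targeted: step *down* the loss gradient toward the target label.
    x_adv = x - step if targeted else x + step
    # Illustrative assumption: inputs are images with pixels in [0, 1].
    return x_adv.clamp(0.0, 1.0).detach()

By contrast, the CW-L2 attack is not a single gradient-sign step: it solves an optimization problem of the form minimize ||delta||_2^2 + c * f(x + delta), where f is a margin-based surrogate that rewards the target class overtaking the others.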

Related research

02/10/2020 · On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks
In this paper, we have extended the well-established universal approxima...

02/15/2022 · Unreasonable Effectiveness of Last Hidden Layer Activations
In standard Deep Neural Network (DNN) based classifiers, the general con...

11/12/2019 · Tight Sample Complexity of Learning One-hidden-layer Convolutional Neural Networks
We study the sample complexity of learning one-hidden-layer convolutiona...

11/03/2021 · A Johnson–Lindenstrauss Framework for Randomly Initialized CNNs
How does the geometric representation of a dataset change after the appl...

09/19/2017 · Training Better CNNs Requires to Rethink ReLU
With the rapid development of Deep Convolutional Neural Networks (DCNNs)...

11/23/2022 · Dual Graphs of Polyhedral Decompositions for the Detection of Adversarial Attacks
Previous work has shown that a neural network with the rectified linear ...

05/02/2023 · Hamming Similarity and Graph Laplacians for Class Partitioning and Adversarial Image Detection
Researchers typically investigate neural network representations by exam...
