Adversarial Examples in Random Neural Networks with General Activations

03/31/2022
by Andrea Montanari, et al.

A substantial body of empirical work documents the lack of robustness of deep learning models to adversarial examples. Recent theoretical work proved that adversarial examples are ubiquitous in two-layer networks of sub-exponential width with ReLU or smooth activations, and in multi-layer ReLU networks of sub-exponential width. We present a result of the same type, with no restriction on width and for general locally Lipschitz continuous activations. More precisely, given a neural network f(·; θ) with random weights θ and a feature vector x, we show that an adversarial example x' can be found with high probability along the direction of the gradient ∇_x f(x; θ). Our proof is based on a Gaussian conditioning technique. Instead of proving that f is approximately linear in a neighborhood of x, we characterize the joint distribution of f(x; θ) and f(x'; θ) for x' = x − s(x)∇_x f(x; θ).
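
To make the gradient-direction construction concrete, here is a minimal numerical sketch, not the paper's construction: it instantiates a random two-layer network with a tanh activation, takes a single step x' = x − s(x)∇_x f(x; θ) with an illustrative step size s(x) derived from a first-order expansion, and reports how much f changes. The dimensions, the activation, and the step-size rule are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 1000, 500                            # input dimension and hidden width (illustrative)

# Random two-layer network f(x) = a^T sigma(W x) / sqrt(m).
W = rng.normal(size=(m, d)) / np.sqrt(d)    # rows approximately unit-norm
a = rng.choice([-1.0, 1.0], size=m)

sigma  = np.tanh                            # any locally Lipschitz activation; tanh as an example
dsigma = lambda z: 1.0 - np.tanh(z) ** 2    # its derivative

def f(x):
    return a @ sigma(W @ x) / np.sqrt(m)

def grad_f(x):
    # Gradient of f in x: W^T (a * sigma'(W x)) / sqrt(m)
    return W.T @ (a * dsigma(W @ x)) / np.sqrt(m)

# Feature vector on the sphere of radius sqrt(d).
x = rng.normal(size=d)
x *= np.sqrt(d) / np.linalg.norm(x)

g = grad_f(x)
# Hypothetical step size s(x): a first-order expansion gives
# f(x - s g) ~ f(x) - s ||g||^2, so this choice aims to flip the sign of f.
s = 2.0 * f(x) / (np.linalg.norm(g) ** 2)
x_adv = x - s * g

print(f"f(x)  = {f(x):+.4f}")
print(f"f(x') = {f(x_adv):+.4f}")
print(f"relative perturbation ||x' - x|| / ||x|| = "
      f"{np.linalg.norm(x_adv - x) / np.linalg.norm(x):.4f}")
```

In this scaling, ||x|| grows like √d while the single gradient step has norm of order one, so the perturbation is vanishingly small relative to the input. As background for the proof-technique sentence, Gaussian conditioning arguments typically rest on the following textbook decomposition of a Gaussian vector (a standard identity, not the paper's exact argument):

```latex
% For g ~ N(0, I_n) and a fixed unit vector u, the projection <u, g> is
% independent of the orthogonal component, so
g \;\overset{d}{=}\; \langle u, g \rangle\, u \;+\; \bigl(I_n - u u^{\mathsf{T}}\bigr)\, \tilde g,
\qquad \tilde g \sim N(0, I_n) \ \text{independent of } \langle u, g \rangle .
```

Conditioning on a projection of the Gaussian weights thus leaves an independent fresh Gaussian in the orthogonal complement, which is what makes it possible to describe the joint law of f(x; θ) and f(x'; θ) directly rather than linearizing f near x.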

Related research

06/23/2021 · Adversarial Examples in Multi-Layer Random ReLU Networks
We consider the phenomenon of adversarial examples in ReLU networks with...

04/08/2021 · A single gradient step finds adversarial examples on random two-layers neural networks
Daniely and Schacham recently showed that gradient descent finds adversa...

06/07/2022 · Adversarial Reprogramming Revisited
Adversarial reprogramming, introduced by Elsayed, Goodfellow, and Sohl-D...

10/28/2020 · Most ReLU Networks Suffer from ℓ^2 Adversarial Perturbations
We consider ReLU networks with random weights, in which the dimension de...

07/31/2021 · The Separation Capacity of Random Neural Networks
Neural networks with random weights appear in a variety of machine learn...

05/22/2018 · A Tropical Approach to Neural Networks with Piecewise Linear Activations
We present a new, unifying approach following some recent developments o...

12/06/2018 · Singular Values for ReLU Layers
Despite their prevalence in neural networks we still lack a thorough the...
