Adversarial Examples from Cryptographic Pseudo-Random Generators

11/15/2018
by Sébastien Bubeck, et al.

In our recent work (Bubeck, Price, Razenshteyn, arXiv:1805.10204) we argued that adversarial examples in machine learning might be due to an inherent computational hardness of the problem. More precisely, we constructed a binary classification task for which (i) a robust classifier exists, yet (ii) no non-trivial accuracy can be achieved by an efficient algorithm in the statistical query model. In the present paper we significantly strengthen both (i) and (ii): we now construct a task which admits (i') a maximally robust classifier (that is, one that tolerates perturbations of size comparable to the size of the examples themselves), and moreover we prove computational hardness of learning this task under (ii') a standard cryptographic assumption.

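The following toy sketch is not the paper's construction, but it illustrates the basic mechanism by which a pseudo-random generator can make a classification task information-theoretically easy yet computationally hard: positive examples are PRG outputs, negative examples are uniform random strings, so any efficient learner with non-trivial accuracy would distinguish PRG outputs from uniform, i.e. break the PRG. SHA-256 in counter mode stands in here as a placeholder PRG, and the names `prg` and `sample` are illustrative; the robustness side of the paper's construction is not captured by this sketch.

```python
import hashlib
import os
import random

def prg(seed: bytes, out_len: int = 64) -> bytes:
    """Stretch a short seed into a longer pseudo-random string.
    SHA-256 in counter mode is a stand-in for a cryptographic PRG."""
    out = b""
    counter = 0
    while len(out) < out_len:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:out_len]

def sample(label: int, seed_len: int = 16, out_len: int = 64) -> bytes:
    """Label 1: the PRG output on a fresh random seed.
    Label 0: a uniformly random string of the same length.
    An unbounded classifier labels perfectly (search all seeds),
    but an efficient learner beating chance would break the PRG."""
    if label == 1:
        return prg(os.urandom(seed_len), out_len)
    return os.urandom(out_len)

# Build a toy dataset for the task.
dataset = []
for _ in range(100):
    y = random.randint(0, 1)
    dataset.append((sample(y), y))
```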

Related research:

Adversarial examples from computational constraints (05/25/2018)
Why are classifiers in high dimension vulnerable to "adversarial" pertur...

Adversarial Examples and Metrics (07/14/2020)
Adversarial examples are a type of attack on machine learning (ML) syste...

Computational Limitations in Robust Classification and Win-Win Results (02/04/2019)
We continue the study of computational limitations in learning robust cl...

Adversarially Robust Learning Could Leverage Computational Hardness (05/28/2019)
Over recent years, devising classification algorithms that are robust to...

Detecting Adversarial Examples Is (Nearly) As Hard As Classifying Them (07/24/2021)
Making classifiers robust to adversarial examples is hard. Thus, many de...

Towards optimally abstaining from prediction (05/28/2021)
A common challenge across all areas of machine learning is that training...

Trigger Warnings: Bootstrapping a Violence Detector for FanFiction (09/09/2022)
We present the first dataset and evaluation results on a newly defined c...
