Adversarial Robustness: What fools you makes you stronger

02/10/2021
by Grzegorz Głuch et al.

We prove an exponential separation in sample complexity between the standard PAC-learning model and a version of the Equivalence-Query-learning model. We then show that this separation has interesting implications for adversarial robustness. We explore a vision of designing an adaptive defense that, in the presence of an attacker, computes a model that is provably robust. In particular, we show how to realize this vision in a simplified setting. To do so, we introduce the notion of a strong adversary: it is not limited in the type of perturbations it can apply, and when presented with a classifier it can repeatedly generate different adversarial examples. We explain why this notion is interesting to study and use it to prove the following: there exists an efficient adversarial-learning-like scheme such that, for every strong adversary 𝐀, it outputs a classifier that either (a) cannot be strongly attacked by 𝐀, or (b) has error at most ϵ. In both cases our scheme uses exponentially (in ϵ) fewer samples than the PAC bound requires.
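The dichotomy in the abstract (either a classifier the adversary cannot attack, or one with small error) lends itself to a rough illustration. Below is a minimal toy sketch, not the paper's construction: it assumes a one-dimensional threshold class and a hypothetical StrongAdversary that, given a classifier, repeatedly searches for points where it disagrees with the target. Every successful attack is fed back as an equivalence-query-style counterexample, and the loop ends either when the adversary fails to attack (case a) or when the learner has accumulated enough counterexamples that its disagreement region is tiny (case b). All names and the search budget are illustrative assumptions.

```python
# Toy sketch of an equivalence-query-style adversarial learning loop.
# StrongAdversary, propose_hypothesis, and the threshold class on [0, 1]
# are illustrative assumptions, not the construction from the paper.
import random


def target(x):
    # Unknown ground-truth concept: a threshold at 0.3 on [0, 1].
    return 1 if x >= 0.3 else 0


def propose_hypothesis(counterexamples):
    # Consistent learner for thresholds: place the boundary between the
    # largest negative and the smallest positive counterexample seen so far.
    negatives = [x for x, y in counterexamples if y == 0]
    positives = [x for x, y in counterexamples if y == 1]
    lo = max(negatives, default=0.0)
    hi = min(positives, default=1.0)
    theta = (lo + hi) / 2
    return lambda x: 1 if x >= theta else 0


class StrongAdversary:
    """Stands in for a 'strong adversary': given a classifier, it keeps
    searching for inputs on which the classifier disagrees with the target."""

    def __init__(self, trials=10_000, seed=0):
        self.trials = trials
        self.rng = random.Random(seed)

    def attack(self, h):
        for _ in range(self.trials):
            x = self.rng.random()
            if h(x) != target(x):
                return x  # adversarial example, returned as a counterexample
        return None  # failed to attack within the search budget


def learn_with_adversary(adversary, rounds=50):
    counterexamples = []
    h = propose_hypothesis(counterexamples)
    for _ in range(rounds):
        x = adversary.attack(h)
        if x is None:
            return h, "no successful attack found"  # case (a)
        counterexamples.append((x, target(x)))
        h = propose_hypothesis(counterexamples)
    return h, "disagreement region shrunk over all rounds"  # case (b)


if __name__ == "__main__":
    h, status = learn_with_adversary(StrongAdversary())
    print(status)
```

In this toy version every counterexample roughly halves the interval on which the hypothesis can still err, which is the kind of progress an equivalence-query learner extracts from each query; the adversary's attacks do the work that labeled samples would otherwise have to do.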
