Sample Complexity Bounds for Robustly Learning Decision Lists against Evasion Attacks

05/12/2022
by   Pascale Gourdeau, et al.

A fundamental problem in adversarial machine learning is to quantify how much training data is needed in the presence of evasion attacks. In this paper we address this issue within the framework of PAC learning, focusing on the class of decision lists. Since distributional assumptions are essential in the adversarial setting, we work with probability distributions on the input data that satisfy a Lipschitz condition: nearby points have similar probability. Our key results illustrate that the adversary's budget (that is, the number of bits it can perturb on each input) is a fundamental quantity in determining the sample complexity of robust learning. Our first main result is a sample-complexity lower bound: the class of monotone conjunctions (essentially the simplest non-trivial hypothesis class on the Boolean hypercube), as well as any superclass of it, has sample complexity at least exponential in the adversary's budget. Our second main result is a corresponding upper bound: for every fixed k, the class of k-decision lists has polynomial sample complexity against a log(n)-bounded adversary. This sheds further light on the question of whether an efficient PAC learning algorithm can always be used as an efficient log(n)-robust learning algorithm under the uniform distribution.
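To make the adversary's budget concrete, the sketch below illustrates an evasion attack on the Boolean hypercube in the exact-in-the-ball sense: given a hypothesis h and a target concept c, the adversary searches for a point within Hamming distance "budget" of the input on which h and c disagree. The function names and the brute-force search are illustrative assumptions, not the paper's algorithms; the search is exponential in the budget, echoing the role this quantity plays in the bounds above.

```python
from itertools import combinations

def monotone_conjunction(relevant_vars):
    """Hypothesis: the AND of the bits at the given indices."""
    return lambda x: all(x[i] for i in relevant_vars)

def evasion_attack(h, c, x, budget):
    """Brute-force search for a point z with Hamming distance at most
    `budget` from x such that h(z) != c(z) (exact-in-the-ball robust loss).
    Returns the perturbed point, or None if h is robustly correct at x.
    Illustrative only: runtime grows exponentially with the budget."""
    n = len(x)
    for r in range(budget + 1):
        for idxs in combinations(range(n), r):
            z = list(x)
            for i in idxs:
                z[i] ^= 1  # flip the chosen bits
            z = tuple(z)
            if h(z) != c(z):
                return z
    return None

# Hypothetical example: target concept c = x0 AND x1, learned h = x0 only.
c = monotone_conjunction([0, 1])
h = monotone_conjunction([0])

# (1, 0, 1) is already a disagreement point, so budget 0 suffices.
print(evasion_attack(h, c, (1, 0, 1), budget=0))
# h and c agree on (1, 1, 1), but flipping one bit fools h.
print(evasion_attack(h, c, (1, 1, 1), budget=1))
```

With budget 0 the attack reduces to checking ordinary disagreement; a single flipped bit is enough to separate the two conjunctions in this toy example.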


Related research

08/23/2023 · Sample Complexity of Robust Learning against Evasion Attacks
It is becoming increasingly important to understand the vulnerability of...

09/12/2019 · On the Hardness of Robust Classification
It is becoming increasingly important to understand the vulnerability of...

12/21/2022 · A Theoretical Study of The Effects of Adversarial Attacks on Sparse Regression
This paper analyzes ℓ_1 regularized linear regression under the challeng...

02/24/2020 · On the Sample Complexity of Adversarial Multi-Source PAC Learning
We study the problem of learning from multiple untrusted data sources, a...

02/26/2020 · Decidability of Sample Complexity of PAC Learning in finite setting
In this short note we observe that the sample complexity of PAC machine ...

06/05/2018 · PAC-learning in the presence of evasion adversaries
The existence of evasion attacks during the test phase of machine learni...

05/28/2023 · On the Role of Noise in the Sample Complexity of Learning Recurrent Neural Networks: Exponential Gaps for Long Sequences
We consider the class of noisy multi-layered sigmoid recurrent neural ne...
