The Complexity of Adversarially Robust Proper Learning of Halfspaces with Agnostic Noise

07/30/2020
by   Ilias Diakonikolas, et al.

We study the computational complexity of adversarially robust proper learning of halfspaces in the distribution-independent agnostic PAC model, with a focus on L_p perturbations. We give a computationally efficient learning algorithm and a nearly matching computational hardness result for this problem. An interesting implication of our findings is that the case of L_∞ perturbations is provably computationally harder than the case of L_p perturbations with 2 ≤ p < ∞.
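As a sketch of the geometry behind robust learning of halfspaces (not the paper's algorithm), note that an L_p-bounded perturbation δ with ‖δ‖_p ≤ r can shift the inner product ⟨w, x⟩ by at most r · ‖w‖_q, where q is the dual exponent (1/p + 1/q = 1). So a point is robustly classified iff its margin exceeds r times the dual norm of w. The function name and test data below are illustrative:

```python
import numpy as np

def robust_loss_halfspace(w, theta, X, y, r, p):
    """Fraction of labeled points (x, y) that an L_p-bounded adversary
    of radius r can force the halfspace sign(<w, x> - theta) to misclassify.

    By Hölder's inequality, min over ||delta||_p <= r of <w, x + delta>
    equals <w, x> - r * ||w||_q, with q the dual exponent of p.
    Hence x is robustly correct iff y * (<w, x> - theta) > r * ||w||_q.
    """
    # Dual exponent: p = 1 <-> q = inf, p = inf <-> q = 1, else q = p/(p-1).
    q = np.inf if p == 1 else (1.0 if np.isinf(p) else p / (p - 1))
    dual = np.linalg.norm(w, ord=q)
    margins = y * (X @ w - theta)
    # Robust 0-1 loss: points whose margin the adversary can erase.
    return float(np.mean(margins <= r * dual))
```

For example, with w = (1, 0), θ = 0, radius r = 1, and L_∞ perturbations (so the dual norm is L_1 and ‖w‖_1 = 1), only points with margin greater than 1 survive the adversary.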


