Efficiently Learning Adversarially Robust Halfspaces with Noise

05/15/2020
by   Omar Montasser, et al.

We study the problem of learning adversarially robust halfspaces in the distribution-independent setting. In the realizable setting, we provide necessary and sufficient conditions on the adversarial perturbation sets under which halfspaces are efficiently robustly learnable. In the presence of random label noise, we give a simple computationally efficient algorithm for this problem with respect to any ℓ_p-perturbation.
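To make the setting concrete: a halfspace sign(⟨w, x⟩ + b) is robustly correct on (x, y) at ℓ_p radius r exactly when y(⟨w, x⟩ + b) > r·‖w‖_q, where q is the dual exponent (1/p + 1/q = 1). The sketch below checks this standard condition; it is an illustration of the robust loss being studied, not the paper's learning algorithm, and the function and variable names are ours.

```python
import numpy as np

def robustly_correct(w, b, x, y, r, p):
    """Check whether sign(<w, x> + b) classifies (x, y) correctly against
    every l_p perturbation of x with norm at most r.

    Standard fact: this holds iff y * (<w, x> + b) > r * ||w||_q,
    where q is the dual exponent of p (1/p + 1/q = 1)."""
    if p == 1:
        dual_norm = np.max(np.abs(w))            # q = infinity
    elif np.isinf(p):
        dual_norm = np.sum(np.abs(w))            # q = 1
    else:
        q = p / (p - 1)
        dual_norm = np.sum(np.abs(w) ** q) ** (1 / q)
    return y * (np.dot(w, x) + b) > r * dual_norm

w = np.array([3.0, 4.0])   # ||w||_2 = 5
x = np.array([1.0, 1.0])   # unperturbed margin: y * <w, x> = 7
print(robustly_correct(w, 0.0, x, +1, r=1.0, p=2))   # 7 > 1.0 * 5  -> True
print(robustly_correct(w, 0.0, x, +1, r=1.5, p=2))   # 7 > 1.5 * 5  -> False
```

The dual-norm reduction is what makes the robust zero-one loss of a halfspace computable in closed form, which is the starting point for efficient algorithms in this setting.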


Related research:

06/13/2023 · Robustly Learning a Single Neuron via Sharpness
We study the problem of learning a single neuron with respect to the L_2...

02/03/2021 · Adversarially Robust Learning with Unknown Perturbation Sets
We study the problem of learning predictors that are robust to adversari...

02/03/2021 · Outlier-Robust Learning of Ising Models Under Dobrushin's Condition
We study the problem of learning Ising models satisfying Dobrushin's con...

02/13/2020 · Learning Halfspaces with Massart Noise Under Structured Distributions
We study the problem of learning halfspaces with Massart noise in the di...

11/26/2019 · Robustly Clustering a Mixture of Gaussians
We give an efficient algorithm for robustly clustering of a mixture of a...

09/10/2021 · ReLU Regression with Massart Noise
We study the fundamental problem of ReLU regression, where the goal is t...

06/21/2015 · Communication Efficient Distributed Agnostic Boosting
We consider the problem of learning from distributed data in the agnosti...
