Learning Halfspaces with Massart Noise Under Structured Distributions

by Ilias Diakonikolas et al.

We study the problem of learning halfspaces with Massart noise in the distribution-specific PAC model. We give the first computationally efficient algorithm for this problem with respect to a broad family of distributions, including log-concave distributions. This resolves an open question posed in a number of prior works. Our approach is extremely simple: We identify a smooth non-convex surrogate loss with the property that any approximate stationary point of this loss defines a halfspace that is close to the target halfspace. Given this structural result, we can use SGD to solve the underlying learning problem.
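The recipe described above — run SGD on a smooth non-convex surrogate loss whose approximate stationary points are close to the target halfspace — can be illustrated with a minimal sketch. The specific surrogate below (a sigmoid of the signed margin) and all parameter choices are assumptions for illustration only; the paper's actual loss and guarantees are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem setup: a homogeneous halfspace sign(<w*, x>) under a log-concave
# marginal (here standard Gaussian), with Massart noise: each label is
# flipped independently with probability eta(x) <= eta < 1/2.
d = 10
w_star = np.zeros(d)
w_star[0] = 1.0
n = 20000
X = rng.standard_normal((n, d))
y = np.sign(X @ w_star)
eta = 0.2
flip = rng.random(n) < eta  # worst case: eta(x) = eta everywhere
y[flip] *= -1

# Hypothetical smooth non-convex surrogate: L(w) = E[ sigmoid(-y <w, x> / s) ].
# This is a standard illustrative choice, not necessarily the paper's loss.
s = 0.5


def grad(w, xb, yb):
    """Mini-batch gradient of the sigmoid surrogate at w."""
    m = -yb * (xb @ w) / s
    sig = 1.0 / (1.0 + np.exp(-m))
    coef = sig * (1.0 - sig) * (-yb / s)
    return (coef[:, None] * xb).mean(axis=0)


# Projected SGD on the unit sphere (homogeneous halfspaces are scale-free).
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
lr, batch = 0.5, 64
for _ in range(3000):
    idx = rng.integers(0, n, batch)
    w -= lr * grad(w, X[idx], y[idx])
    w /= np.linalg.norm(w)

# Despite the noisy labels, the learned direction should be close to w*.
angle = np.arccos(np.clip(w @ w_star, -1.0, 1.0))
print(f"angle to target: {angle:.3f} rad")
```

Projecting back onto the unit sphere after each step keeps the iterate in the natural parameter space for homogeneous halfspaces; the structural result in the abstract is what justifies stopping at an approximate stationary point rather than a global minimum.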




Non-Convex SGD Learns Halfspaces with Adversarial Label Noise

We study the problem of agnostically learning homogeneous halfspaces in ...

Learning Halfspaces with Tsybakov Noise

We study the efficient PAC learnability of halfspaces in the presence of...

Learning a Single Neuron with Adversarial Label Noise via Gradient Descent

We study the fundamental problem of learning a single neuron, i.e., a fu...

Efficiently Learning Adversarially Robust Halfspaces with Noise

We study the problem of learning adversarially robust halfspaces in the ...

A Polynomial Time Algorithm for Learning Halfspaces with Tsybakov Noise

We study the problem of PAC learning homogeneous halfspaces in the prese...

Distribution-Independent PAC Learning of Halfspaces with Massart Noise

We study the problem of distribution-independent PAC learning of halfsp...

S-Concave Distributions: Towards Broader Distributions for Noise-Tolerant and Sample-Efficient Learning Algorithms

We provide new results concerning noise-tolerant and sample-efficient le...