Learning Halfspaces with Massart Noise Under Structured Distributions

02/13/2020
by Ilias Diakonikolas et al.

We study the problem of learning halfspaces with Massart noise in the distribution-specific PAC model. We give the first computationally efficient algorithm for this problem with respect to a broad family of distributions, including log-concave distributions. This resolves an open question posed in a number of prior works. Our approach is extremely simple: We identify a smooth non-convex surrogate loss with the property that any approximate stationary point of this loss defines a halfspace that is close to the target halfspace. Given this structural result, we can use SGD to solve the underlying learning problem.
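The abstract sketches a two-step recipe: identify a smooth non-convex surrogate loss whose approximate stationary points all correspond to halfspaces close to the target, then minimize it with SGD. Below is a minimal illustrative sketch of that recipe, not the paper's exact algorithm: it assumes a logistic-type surrogate of the normalized margin, a Gaussian (hence log-concave) marginal, a simple randomized choice of Massart flip rates bounded by eta, and projected SGD on the unit sphere. All parameter values, and the specific surrogate, are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# ----- Synthetic setup (illustrative assumptions, not from the paper) -----
d, n, eta = 20, 50_000, 0.2               # dimension, samples, Massart noise bound
w_star = np.zeros(d)
w_star[0] = 1.0                           # target halfspace normal (unit norm)

X = rng.standard_normal((n, d))           # Gaussian marginal (log-concave)
y = np.sign(X @ w_star)
# Massart noise: each label flips independently with probability eta(x) <= eta;
# the adversary may pick any point-wise rates below eta (random rates here).
y[rng.random(n) < eta * rng.random(n)] *= -1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def surrogate_grad(w, Xb, yb):
    """Gradient, restricted to the unit sphere, of the smooth non-convex
    surrogate L(w) = E[sigmoid(-y <w, x> / ||w||)] (an assumed stand-in).
    With ||w|| = 1: grad = E[-s (1 - s) y (x - <w, x> w)], s = sigmoid(-m)."""
    m = yb * (Xb @ w)                     # normalized margins, since ||w|| = 1
    s = sigmoid(-m)
    coeff = -s * (1.0 - s) * yb           # chain rule through sigmoid(-m)
    tangent = Xb - np.outer(Xb @ w, w)    # normalization term: tangent projection
    return (coeff[:, None] * tangent).mean(axis=0)

# ----- Projected SGD: take a step, renormalize back onto the unit sphere -----
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
for t in range(3_000):
    idx = rng.integers(0, n, size=128)    # minibatch
    w -= (1.0 / np.sqrt(t + 1)) * surrogate_grad(w, X[idx], y[idx])
    w /= np.linalg.norm(w)

print(f"angle to target: {np.arccos(np.clip(w @ w_star, -1.0, 1.0)):.3f} rad")
```

Transposed to this sketch, the abstract's structural claim is that the surrogate landscape has no spurious stationary points on the sphere: any w where the projected gradient nearly vanishes is already angularly close to w_star, which is why plain SGD succeeds despite non-convexity.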

Related research:

- Non-Convex SGD Learns Halfspaces with Adversarial Label Noise (06/11/2020): We study the problem of agnostically learning homogeneous halfspaces in ...
- Learning Halfspaces with Tsybakov Noise (06/11/2020): We study the efficient PAC learnability of halfspaces in the presence of...
- Learning a Single Neuron with Adversarial Label Noise via Gradient Descent (06/17/2022): We study the fundamental problem of learning a single neuron, i.e., a fu...
- A Polynomial Time Algorithm for Learning Halfspaces with Tsybakov Noise (10/04/2020): We study the problem of PAC learning homogeneous halfspaces in the prese...
- Efficiently Learning Adversarially Robust Halfspaces with Noise (05/15/2020): We study the problem of learning adversarially robust halfspaces in the ...
- Robustly Learning a Single Neuron via Sharpness (06/13/2023): We study the problem of learning a single neuron with respect to the L_2...
- S-Concave Distributions: Towards Broader Distributions for Noise-Tolerant and Sample-Efficient Learning Algorithms (03/22/2017): We provide new results concerning noise-tolerant and sample-efficient le...
