Learning Halfspaces with Tsybakov Noise

06/11/2020
by Ilias Diakonikolas, et al.

We study the efficient PAC learnability of halfspaces in the presence of Tsybakov noise. In the Tsybakov noise model, each label is independently flipped with some probability that is controlled by an adversary. This noise model significantly generalizes the Massart noise model by allowing the flipping probabilities to be arbitrarily close to 1/2 for a fraction of the samples. Our main result is the first non-trivial PAC learning algorithm for this problem under a broad family of structured distributions, satisfying certain concentration and (anti-)anti-concentration properties, including log-concave distributions. Specifically, we give an algorithm that achieves misclassification error ϵ with respect to the true halfspace, with quasi-polynomial runtime dependence on 1/ϵ. The only previous upper bound for this problem, even for the special case of log-concave distributions, was doubly exponential in 1/ϵ (and follows via the naive reduction to agnostic learning). Our approach relies on a novel computationally efficient procedure, based on semi-definite programming, that certifies whether a candidate solution is near-optimal. We use this certificate procedure as a black box and turn it into an efficient learning algorithm by searching over the space of halfspaces via online convex optimization.
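For concreteness, one standard parametrization of the Tsybakov noise condition reads as follows; this is a common formalization consistent with the description above, and the paper's exact notation may differ. Each example x has its true label flipped with probability η(x) ≤ 1/2, and for parameters α ∈ (0, 1] and A ≥ 1,

\Pr_{x \sim D_x}\left[\, \eta(x) \ge \tfrac{1}{2} - t \,\right] \le A\, t^{\alpha/(1-\alpha)} \qquad \text{for all } t > 0.

Under this condition the flipping probability can come arbitrarily close to 1/2, but only on a correspondingly small fraction of the distribution. Massart noise, where η(x) ≤ 1/2 − γ uniformly for some margin γ > 0, satisfies this condition with suitable constants, which is the sense in which Tsybakov noise generalizes it.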
