Learning Halfspaces with Tsybakov Noise

06/11/2020
by Ilias Diakonikolas et al.

We study the efficient PAC learnability of halfspaces in the presence of Tsybakov noise. In the Tsybakov noise model, each label is independently flipped with some probability which is controlled by an adversary. This noise model significantly generalizes the Massart noise model by allowing the flipping probabilities to be arbitrarily close to 1/2 for a fraction of the samples. Our main result is the first non-trivial PAC learning algorithm for this problem under a broad family of structured distributions – satisfying certain concentration and (anti-)anti-concentration properties – including log-concave distributions. Specifically, we give an algorithm that achieves misclassification error ϵ with respect to the true halfspace, with quasi-polynomial runtime dependence on 1/ϵ. The only previous upper bound for this problem – even for the special case of log-concave distributions – was doubly exponential in 1/ϵ (and follows via the naive reduction to agnostic learning). Our approach relies on a novel computationally efficient procedure to certify whether a candidate solution is near-optimal, based on semi-definite programming. We use this certificate procedure as a black box and turn it into an efficient learning algorithm by searching over the space of halfspaces via online convex optimization.
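For context, one standard formalization of the Tsybakov condition (stated here as background; the paper's exact parameterization may differ) bounds the mass of points whose flip probability η(x) approaches 1/2:

    Pr_{x ∼ D}[ η(x) ≥ 1/2 − t ] ≤ A · t^{α/(1−α)}   for all t > 0,

with parameters α ∈ (0, 1) and A ≥ 1, where η(x) ≤ 1/2 for all x. Massart noise is the special case where η(x) ≤ 1/2 − γ uniformly for some fixed γ > 0, which is why Tsybakov noise is strictly more general.

The sketch below illustrates the high-level loop described in the abstract: a black-box certificate oracle is queried at each candidate halfspace, and its answer either certifies near-optimality or supplies an update direction for an online-gradient-style search over unit-norm halfspaces. The `certify` heuristic here (a simple misclassification-correlation check) and the function names are hypothetical stand-ins; the paper's actual certificate solves a semidefinite program, which is not reproduced here.

```python
import numpy as np

def certify(w, X, y, eps):
    """Hypothetical stand-in for the paper's SDP-based certificate.

    Returns None if the candidate halfspace w looks (empirically)
    near-optimal, otherwise a direction to move in. The real procedure
    in the paper solves a semidefinite program; this moment-based
    heuristic is purely for illustration.
    """
    margins = y * (X @ w)          # signed margins of each sample
    mis = margins < 0              # currently misclassified points
    if mis.mean() <= eps:
        return None                # "certificate" of near-optimality
    # Correlation of misclassified points with their labels serves as a
    # crude improvement direction (NOT the paper's SDP certificate).
    return (y[mis][:, None] * X[mis]).mean(axis=0)

def learn_halfspace(X, y, eps=0.05, eta=0.5, max_iters=1000, seed=0):
    """Online-gradient-descent search over unit-norm halfspaces,
    using certify() as a black box, in the spirit of the abstract."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(max_iters):
        g = certify(w, X, y, eps)
        if g is None:
            break                  # certified (approximately) optimal
        w = w + eta * g            # gradient step ...
        w /= np.linalg.norm(w)     # ... projected back to the unit sphere
    return w

if __name__ == "__main__":
    # Toy data: a true halfspace with a small fraction of flipped labels
    # (a crude stand-in for Tsybakov noise).
    rng = np.random.default_rng(1)
    X = rng.standard_normal((2000, 5))
    w_star = np.array([1.0, -2.0, 0.5, 0.0, 1.0])
    w_star /= np.linalg.norm(w_star)
    y = np.sign(X @ w_star)
    flip = rng.random(2000) < 0.05
    y[flip] *= -1
    w_hat = learn_halfspace(X, y, eps=0.1)
    print("angle to true halfspace:",
          np.arccos(np.clip(w_hat @ w_star, -1.0, 1.0)))
```

The separation into oracle plus search mirrors the abstract's design: the certificate is used purely as a black box, so a stronger certification procedure could be swapped in without changing the outer optimization loop.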
