On the Power of Localized Perceptron for Label-Optimal Learning of Halfspaces with Adversarial Noise

12/19/2020
by Jie Shen

We study online active learning of homogeneous halfspaces in ℝ^d under adversarial noise, where the overall probability of a noisy label is constrained to be at most ν. Our main contribution is a Perceptron-like online active learning algorithm that runs in polynomial time and, under the conditions that the marginal distribution is isotropic log-concave and ν = Ω(ϵ), where ϵ ∈ (0, 1) is the target error rate, PAC learns the underlying halfspace with near-optimal label complexity of Õ(d · polylog(1/ϵ)) and sample complexity of Õ(d/ϵ). Prior to this work, existing online algorithms designed to tolerate adversarial noise were subject to either label complexity polynomial in 1/ϵ, suboptimal noise tolerance, or restrictive marginal distributions. With the additional prior knowledge that the underlying halfspace is s-sparse, we obtain an attribute-efficient label complexity of Õ(s · polylog(d, 1/ϵ)) and sample complexity of Õ(s/ϵ · polylog(d)). As an immediate corollary, we show that under the agnostic model, where no assumption is made on the noise rate ν, our active learner achieves an error rate of O(OPT) + ϵ with the same running time and label and sample complexity, where OPT is the best error rate achievable by any homogeneous halfspace.
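To make the "localized Perceptron" idea concrete, the following is a minimal, hypothetical Python sketch of margin-based active learning: labels are queried only for points falling inside a shrinking band around the current decision boundary, and a Perceptron-style update is applied on mistakes. This is not the paper's algorithm; its actual update rule, projection steps, and parameter schedules differ, and the names here (band_width, shrink, n_epochs) are illustrative assumptions.

```python
# Illustrative sketch (NOT the paper's exact algorithm) of a localized,
# margin-based Perceptron active learner. Labels are queried only inside
# a band |w.x| <= b around the current boundary; the band shrinks each
# epoch. All parameter names and schedules are assumptions.
import numpy as np

def localized_perceptron(X, oracle, n_epochs=10, band_width=1.0, shrink=0.5):
    """X: stream of points; oracle(x) returns a +/-1 label (possibly
    adversarially noisy). Returns a unit weight vector and the label count."""
    d = X.shape[1]
    rng = np.random.default_rng(0)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    b = band_width
    labels_used = 0
    for _ in range(n_epochs):
        for x in X:
            margin = w @ x
            if abs(margin) > b:          # outside the band: no label query
                continue
            y = oracle(x)                # query the label (the costly step)
            labels_used += 1
            if y * margin <= 0:          # mistake inside the band: update
                w = w + y * x
                w /= np.linalg.norm(w)   # keep the iterate on the unit sphere
        b *= shrink                      # localize: shrink the band each epoch
    return w, labels_used

# Toy usage: standard Gaussian marginal (isotropic log-concave), a hidden
# halfspace w*, and labels flipped uniformly at random with rate ν (one
# valid, non-worst-case instance of adversarial noise).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d, n = 20, 20000
    w_star = np.zeros(d); w_star[0] = 1.0
    X = rng.standard_normal((n, d))
    noise_rate = 0.01
    def oracle(x):
        y = np.sign(w_star @ x) or 1.0
        return -y if rng.random() < noise_rate else y
    w, used = localized_perceptron(X, oracle)
    err = np.mean(np.sign(X @ w_star) != np.sign(X @ w))
    print(f"labels queried: {used}, disagreement with w*: {err:.4f}")
```

The point of the band is label efficiency: points far from the boundary carry little information about the halfspace, so skipping them (and halving the band as the estimate improves) is what drives the label complexity down to polylogarithmic in 1/ϵ in the paper's analysis.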
