On the Power of Localized Perceptron for Label-Optimal Learning of Halfspaces with Adversarial Noise

12/19/2020
by Jie Shen, et al.

We study online active learning of homogeneous halfspaces in ℝ^d with adversarial noise, where the overall probability of a noisy label is constrained to be at most ν. Our main contribution is a Perceptron-like online active learning algorithm that runs in polynomial time, and under the conditions that the marginal distribution is isotropic log-concave and ν = Ω(ϵ), where ϵ ∈ (0, 1) is the target error rate, our algorithm PAC learns the underlying halfspace with near-optimal label complexity of Õ(d · polylog(1/ϵ)) and sample complexity of Õ(d/ϵ). Prior to this work, existing online algorithms designed to tolerate adversarial noise were subject to either label complexity polynomial in 1/ϵ, suboptimal noise tolerance, or restrictive marginal distributions. With the additional prior knowledge that the underlying halfspace is s-sparse, we obtain attribute-efficient label complexity of Õ(s · polylog(d, 1/ϵ)) and sample complexity of Õ(s/ϵ · polylog(d)). As an immediate corollary, we show that under the agnostic model, where no assumption is made on the noise rate ν, our active learner achieves an error rate of O(OPT) + ϵ with the same running time and label and sample complexity, where OPT is the best possible error rate achievable by any homogeneous halfspace.
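To make the "localized Perceptron" idea concrete, the following is a minimal sketch of a margin-based active Perceptron: labels are queried only for instances falling near the current decision boundary, and the hypothesis is updated Perceptron-style on queried mistakes. This is an illustrative toy, not the paper's algorithm; in particular, the fixed `query_margin` and learning rate `lr` are made-up parameters, whereas the actual method uses carefully scheduled margin bands and update sizes to obtain its label-complexity guarantees.

```python
import numpy as np

def active_perceptron(stream, d, query_margin=0.1, lr=0.05):
    """Toy margin-based (localized) active Perceptron.

    `stream` yields pairs (x, get_label), where get_label() returns the
    label y in {-1, +1} only when called (simulating an annotator query).
    `query_margin` and `lr` are illustrative constants, not the schedules
    from the paper.
    """
    w = np.zeros(d)
    w[0] = 1.0  # arbitrary unit-norm initialization
    labels_used = 0
    for x, get_label in stream:
        margin = np.dot(w, x)
        if abs(margin) > query_margin:
            continue                   # far from the boundary: skip the label
        y = get_label()                # query the annotator
        labels_used += 1
        if y * margin <= 0:            # mistake inside the margin band
            w = w + lr * y * x         # Perceptron-style correction
            w = w / np.linalg.norm(w)  # renormalize: halfspace stays homogeneous
    return w, labels_used
```

The key point the sketch illustrates is that label queries are restricted to the band `|w·x| <= query_margin`, so under an isotropic marginal only a small fraction of the stream is ever labeled; localization is what decouples label complexity from 1/ϵ.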

