Noise-adaptive Margin-based Active Learning and Lower Bounds under Tsybakov Noise Condition

06/20/2014
by   Yining Wang, et al.

We present a simple noise-robust margin-based active learning algorithm for finding homogeneous (passing through the origin) linear separators, and analyze its error convergence when labels are corrupted by noise. We show that when the noise satisfies the Tsybakov low-noise condition (Mammen and Tsybakov 1999; Tsybakov 2004), the algorithm adapts to the unknown noise level and achieves the optimal statistical rate up to poly-logarithmic factors. We also derive lower bounds for margin-based active learning algorithms under the Tsybakov noise condition (TNC) in the membership query synthesis scenario (Angluin 1988). Our result implies lower bounds for the stream-based selective sampling scenario (Cohn 1990) under TNC for some fairly simple data distributions. Quite surprisingly, we show that the sample complexity cannot be improved even when the underlying data distribution is as simple as the uniform distribution on the unit ball. Our proof constructs a well-separated hypothesis set on the d-dimensional unit ball together with carefully designed label distributions satisfying the Tsybakov noise condition. The analysis may provide insights for other forms of lower bounds as well.
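The margin-based scheme the abstract describes can be sketched as follows. This is an illustrative toy, not the paper's exact algorithm: the averaging estimator in `fit`, the noise model in `query_label`, and the margin schedule `0.8**k` are all assumptions chosen for simplicity. The common thread with margin-based active learning is that, after an initial random batch, labels are queried only for points inside a shrinking band around the current separator, which is where Tsybakov-style noise concentrates.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
w_star = np.array([1.0] + [0.0] * (d - 1))  # true homogeneous separator

def sample_sphere(n):
    """Draw n points uniformly from the unit sphere in R^d."""
    x = rng.normal(size=(n, d))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def query_label(x):
    """Noisy label oracle (hypothetical noise model): the flip probability
    approaches 1/2 as a point nears the true boundary, qualitatively
    mimicking a Tsybakov-style low-noise condition."""
    m = np.abs(x @ w_star)
    flip = 0.5 - 0.4 * np.minimum(m, 1.0)   # flip prob in [0.1, 0.5]
    y = np.sign(x @ w_star)
    return np.where(rng.random(len(x)) < flip, -y, y)

def fit(x, y):
    """Averaging estimator: for this symmetric distribution E[y x] is
    parallel to w_star, so a normalized label-weighted mean suffices."""
    w = (y[:, None] * x).mean(axis=0)
    return w / np.linalg.norm(w)

# Round 0: label a small random batch to get a rough separator.
x_lab = sample_sphere(500)
y_lab = query_label(x_lab)
w = fit(x_lab, y_lab)

# Later rounds: query labels only inside a shrinking margin band.
for k in range(1, 6):
    b = 0.8 ** k                          # margin width schedule (assumed)
    pool = sample_sphere(20000)
    band = pool[np.abs(pool @ w) <= b]    # unlabeled points near the boundary
    x_lab = np.vstack([x_lab, band])
    y_lab = np.concatenate([y_lab, query_label(band)])
    w = fit(x_lab, y_lab)

x_test = sample_sphere(20000)
err = np.mean(np.sign(x_test @ w) != np.sign(x_test @ w_star))
```

For homogeneous separators under the uniform distribution on the sphere, the error of a candidate `w` is proportional to its angle with `w_star`, so a small `err` here reflects the separator converging even though labels near the boundary are nearly pure noise.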


Related research

10/30/2016  Active Learning from Imperfect Labelers
We study active learning where the labeler can not only return incorrect...

12/22/2015  Refined Error Bounds for Several Learning Algorithms
This article studies the achievable guarantees on the error rates of cer...

02/10/2021  Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov noise
We develop a computationally-efficient PAC active learning algorithm for...

04/05/2012  Distribution-Dependent Sample Complexity of Large Margin Learning
We obtain a tight distribution-specific characterization of the sample c...

11/06/2012  Active and passive learning of linear separators under log-concave distributions
We provide new results concerning label efficient, polynomial time, pass...

05/15/2015  An Analysis of Active Learning With Uniform Feature Noise
In active learning, the user sequentially chooses values for feature X a...

10/11/2019  Not All are Made Equal: Consistency of Weighted Averaging Estimators Under Active Learning
Active learning seeks to build the best possible model with a budget of ...
