An Analysis of Active Learning With Uniform Feature Noise

05/15/2015
by Aaditya Ramdas et al.

In active learning, the user sequentially chooses values for feature X and an oracle returns the corresponding label Y. In this paper, we consider the effect of feature noise in active learning, which could arise because X itself is measured noisily, because X is corrupted in transmission to the oracle, or because the oracle returns the label of a noisy version of the query point. In statistics, feature noise is known as "errors in variables" and has been studied extensively in non-active settings. However, the effect of feature noise in active learning has not been studied before. We consider the well-known Berkson errors-in-variables model with additive uniform noise of width σ. Our simple but revealing setting is one-dimensional binary classification, where the goal is to learn a threshold (the point where the probability of a + label crosses half). We deal with regression functions that are antisymmetric in a region of size σ around the threshold and also satisfy Tsybakov's margin condition there. We prove minimax lower and upper bounds which demonstrate that when σ is smaller than the minimax active/passive noiseless error derived in CN07, the noise has no effect on the rates and one achieves the same noiseless rates. For larger σ, the unflattening of the regression function under convolution with uniform noise, together with its local antisymmetry around the threshold, yields a regime in which the noise appears to be beneficial. Our key result is that active learning can buy significant improvement over a passive strategy even in the presence of feature noise.
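To make the setting concrete, here is a minimal, purely illustrative Python sketch of a Berkson feature-noise oracle and a naive active (bisection-style) threshold search. The regression function eta, the threshold, the noise width SIGMA, and the query strategy are hypothetical choices made for this sketch; they are not the procedures or rates analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

THRESHOLD = 0.3   # illustrative true threshold t* (unknown to the learner)
SIGMA = 0.1       # width of the uniform Berkson feature noise


def eta(x):
    """Illustrative regression function P(Y = 1 | X = x).

    It crosses 1/2 at THRESHOLD and is locally antisymmetric around it,
    mimicking (not reproducing) the paper's assumptions.
    """
    return np.clip(0.5 + (x - THRESHOLD), 0.0, 1.0)


def berkson_oracle(x):
    """Berkson model: the learner queries x, but the label is drawn at the
    perturbed point x + U with U ~ Uniform[-SIGMA/2, SIGMA/2]."""
    u = rng.uniform(-SIGMA / 2, SIGMA / 2)
    return int(rng.random() < eta(x + u))


def active_threshold_estimate(n_queries=2000, reps_per_point=20):
    """Naive active strategy: query the current interval midpoint repeatedly
    and bisect on the majority label. Only a sketch; the minimax-optimal
    procedures in the paper are more refined."""
    lo, hi = 0.0, 1.0
    for _ in range(n_queries // reps_per_point):
        mid = (lo + hi) / 2
        votes = sum(berkson_oracle(mid) for _ in range(reps_per_point))
        if votes > reps_per_point / 2:
            hi = mid   # majority of + labels: the threshold lies to the left
        else:
            lo = mid
    return (lo + hi) / 2


if __name__ == "__main__":
    print("estimated threshold:", active_threshold_estimate())
    print("true threshold:     ", THRESHOLD)
```

Under the uniform Berkson noise, the label at a query x is effectively drawn from the convolution of the regression function with the noise density; the local antisymmetry assumption keeps the smoothed function crossing 1/2 at the same threshold, which is what makes a query strategy like the one sketched remain sensible.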
