
Learning in Confusion: Batch Active Learning with Noisy Oracle

We study the problem of training machine learning models incrementally via active learning with access to imperfect or noisy oracles. We specifically consider batch active learning, in which multiple samples are selected at once, rather than one at a time as in classical settings, so as to reduce the training overhead. When selecting a batch of new samples, our approach bridges uniform randomness and score-based importance sampling over clusters. Experiments on benchmark image classification datasets (MNIST, SVHN, and CIFAR10) show improvements over existing active learning strategies. We further introduce an extra denoising layer in deep networks to make active learning robust to label noise, yielding significant additional improvements.
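The batch-selection idea can be made concrete with a short sketch. The snippet below draws a batch from a distribution that interpolates between uniform randomness over clusters and importance sampling proportional to each cluster's aggregate uncertainty score. The k-means clustering, the predictive-entropy score, and the mixing weight `lam` are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of cluster-based batch selection that mixes uniform
# sampling with score-based importance sampling (assumed details:
# k-means clusters, entropy scores, mixing weight `lam`).
import numpy as np
from sklearn.cluster import KMeans


def entropy_scores(probs: np.ndarray) -> np.ndarray:
    """Predictive-entropy uncertainty score for each unlabeled sample."""
    eps = 1e-12
    return -(probs * np.log(probs + eps)).sum(axis=1)


def select_batch(embeddings, probs, batch_size, n_clusters=50, lam=0.5, rng=None):
    """Pick `batch_size` indices from the unlabeled pool.

    lam = 0 -> purely uniform over clusters;
    lam = 1 -> purely proportional to each cluster's total uncertainty.
    """
    rng = rng or np.random.default_rng()
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(embeddings)
    scores = entropy_scores(probs)

    # Aggregate uncertainty per cluster, then mix with the uniform law.
    cluster_scores = np.array([scores[labels == c].sum() for c in range(n_clusters)])
    importance = cluster_scores / cluster_scores.sum()
    uniform = np.full(n_clusters, 1.0 / n_clusters)
    mix = (1.0 - lam) * uniform + lam * importance

    # Sample clusters from the mixed distribution, then one member per draw.
    # Duplicates are possible in this simplified sketch; a real
    # implementation would sample without replacement.
    chosen = []
    for c in rng.choice(n_clusters, size=batch_size, p=mix):
        members = np.flatnonzero(labels == c)
        chosen.append(rng.choice(members))
    return np.asarray(chosen)
```

The denoising layer can be sketched similarly. A common construction for label-noise robustness, used here only as an assumed stand-in for the paper's exact layer, appends a learned row-stochastic transition matrix that maps the model's clean-class posterior to a noisy-label posterior during training; at test time the base softmax is used directly. The near-identity initialization is a convention of this construction, not a detail from the paper.

```python
# Sketch of a label-noise adaptation ("denoising") layer in PyTorch.
# T is kept row-stochastic via a softmax over rows, so
# noisy_probs = clean_probs @ T is a valid distribution.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenoisingLayer(nn.Module):
    def __init__(self, num_classes: int, init_off_diag: float = 1e-2):
        super().__init__()
        # Near-identity initialization: strong diagonal, small off-diagonal.
        init = torch.full((num_classes, num_classes), init_off_diag)
        init.fill_diagonal_(1.0)
        self.logits = nn.Parameter(init.log())

    def forward(self, clean_probs: torch.Tensor) -> torch.Tensor:
        T = F.softmax(self.logits, dim=1)  # row-stochastic transition matrix
        return clean_probs @ T             # distribution over noisy labels


# Usage during training (base_model outputs raw logits):
#   clean_probs = F.softmax(base_model(x), dim=1)
#   noisy_probs = DenoisingLayer(num_classes)(clean_probs)
#   loss = F.nll_loss(torch.log(noisy_probs + 1e-12), noisy_targets)
```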


Related research

Diverse mini-batch Active Learning (01/17/2019)
We study the problem of reducing the amount of labeled training data req...

Batch Active Learning at Scale (07/29/2021)
The ability to train complex and highly effective models often requires ...

The Practical Challenges of Active Learning: Lessons Learned from Live Experimentation (06/28/2019)
We tested in a live setting the use of active learning for selecting tex...

Active Learning for Single Neuron Models with Lipschitz Non-Linearities (10/24/2022)
We consider the problem of active learning for single neuron models, als...

On the reusability of samples in active learning (06/13/2022)
An interesting but not extensively studied question in active learning i...

Search Improves Label for Active Learning (02/23/2016)
We investigate active learning with access to two distinct oracles: Labe...

Minimum-Margin Active Learning (05/31/2019)
We present a new active sampling method we call min-margin which trains ...