Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds

06/09/2019
by   Jordan T. Ash, et al.

We design a new algorithm for batch active learning with deep neural network models. Our algorithm, Batch Active learning by Diverse Gradient Embeddings (BADGE), samples groups of points that are disparate and high-magnitude when represented in a hallucinated gradient space, a strategy designed to incorporate both predictive uncertainty and sample diversity into every selected batch. Crucially, BADGE trades off between diversity and uncertainty without requiring any hand-tuned hyperparameters. We show that while other approaches sometimes succeed for particular batch sizes or architectures, BADGE consistently performs as well as or better, making it a versatile option for practical active learning problems.
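The core idea above can be sketched in two steps: embed each unlabeled point as the gradient the last layer would receive if the model's own most likely prediction were its label (a "hallucinated" gradient, whose norm reflects uncertainty and whose direction reflects the sample), then pick a batch of points with diverse, high-magnitude embeddings via k-means++ seeding. The following is a minimal NumPy sketch under those assumptions, not the authors' implementation; the function names and shapes are illustrative.

```python
import numpy as np

def gradient_embeddings(hidden, probs):
    """Hallucinated last-layer gradient embeddings (illustrative sketch).

    hidden: (n, d) penultimate-layer features for n unlabeled points.
    probs:  (n, k) predicted class probabilities.

    Treating each point's argmax class as its label, the cross-entropy
    gradient w.r.t. the final linear layer is the outer product
    (probs - onehot(argmax)) x hidden, flattened to a (k * d)-vector.
    """
    n, k = probs.shape
    scores = probs.copy()
    scores[np.arange(n), probs.argmax(axis=1)] -= 1.0  # probs - onehot
    return (scores[:, :, None] * hidden[:, None, :]).reshape(n, -1)

def kmeanspp_select(emb, batch_size, rng):
    """Select a batch by k-means++ seeding: each new point is sampled
    with probability proportional to its squared distance from the
    points chosen so far, favoring diverse, high-magnitude embeddings."""
    chosen = [int(rng.integers(len(emb)))]
    d2 = np.sum((emb - emb[chosen[0]]) ** 2, axis=1)
    while len(chosen) < batch_size:
        weights = d2 / d2.sum()
        nxt = int(rng.choice(len(emb), p=weights))
        chosen.append(nxt)
        d2 = np.minimum(d2, np.sum((emb - emb[nxt]) ** 2, axis=1))
    return chosen
```

Because the embedding norm grows with predictive uncertainty and the k-means++ distances push selections apart, the batch trades off uncertainty and diversity without an explicit mixing hyperparameter, which is the property the abstract highlights.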


Related research

03/05/2023 · Streaming Active Learning with Deep Neural Networks
Active learning is perhaps most naturally posed as an online learning pr...

11/20/2019 · Deep Active Learning: Unified and Principled Method for Query and Training
In this paper, we proposed a unified and principled method for both quer...

01/17/2019 · Diverse mini-batch Active Learning
We study the problem of reducing the amount of labeled training data req...

05/31/2019 · Minimum-Margin Active Learning
We present a new active sampling method we call min-margin which trains ...

04/28/2021 · Diversity-Aware Batch Active Learning for Dependency Parsing
While the predictive performance of modern statistical dependency parser...

09/17/2012 · Submodularity in Batch Active Learning and Survey Problems on Gaussian Random Fields
Many real-world datasets can be represented in the form of a graph whose...

07/29/2021 · Batch Active Learning at Scale
The ability to train complex and highly effective models often requires ...
