Minimum-Margin Active Learning

05/31/2019
by Heinrich Jiang, et al.

We present a new active sampling method, min-margin, which trains multiple learners on bootstrap samples and then chooses which examples to label based on each candidate's minimum margin across the bootstrapped models. This extends standard margin sampling by increasing the diversity of the selected batch in a manner informed by model uncertainty. We focus on the one-shot batch active learning setting and show, theoretically and through extensive experiments on a broad set of problems, that min-margin outperforms other methods, particularly as the batch size grows.
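The selection rule described above can be sketched in a few lines: train several models on bootstrap resamples of the labeled set, score each unlabeled point by its minimum margin (top-class probability minus runner-up probability) over those models, and label the points with the smallest such score. This is a minimal sketch, not the paper's exact implementation; the choice of scikit-learn logistic regression as the base learner and the specific hyperparameters are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def min_margin_select(X_labeled, y_labeled, X_pool, batch_size,
                      n_models=5, seed=0):
    """Illustrative min-margin batch selection.

    Trains n_models classifiers on bootstrap samples of the labeled
    data, computes each pool point's margin under every model, and
    returns the indices of the batch_size pool points whose minimum
    margin across models is smallest (i.e., most uncertain somewhere).
    """
    rng = np.random.default_rng(seed)
    n = len(X_labeled)
    margins = np.empty((n_models, len(X_pool)))
    for m in range(n_models):
        # Bootstrap resample of the labeled set (with replacement).
        idx = rng.integers(0, n, size=n)
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X_labeled[idx], y_labeled[idx])
        proba = clf.predict_proba(X_pool)
        # Margin = probability of top class minus probability of runner-up.
        top2 = np.sort(proba, axis=1)[:, -2:]
        margins[m] = top2[:, 1] - top2[:, 0]
    # Score each candidate by its minimum margin over the ensemble,
    # then pick the batch_size lowest-scoring candidates.
    min_margin = margins.min(axis=0)
    return np.argsort(min_margin)[:batch_size]
```

Taking the minimum over the ensemble (rather than, say, the mean) favors points that at least one bootstrapped model finds ambiguous, which is what injects diversity relative to plain margin sampling on a single model.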


Related research

- Batch Active Learning at Scale (07/29/2021): The ability to train complex and highly effective models often requires ...
- MEAL: Stable and Active Learning for Few-Shot Prompting (11/15/2022): Few-shot classification in NLP has recently made great strides due to th...
- Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds (06/09/2019): We design a new algorithm for batch active learning with deep neural net...
- Active Learning of Halfspaces under a Margin Assumption (12/07/2011): We derive and analyze a new, efficient, pool-based active learning algor...
- Learning Distinctive Margin toward Active Domain Adaptation (03/11/2022): Despite plenty of efforts focusing on improving the domain adaptation ab...
- Augmented Memory Networks for Streaming-Based Active One-Shot Learning (09/04/2019): One of the major challenges in training deep architectures for predictiv...
- Diversity-Aware Batch Active Learning for Dependency Parsing (04/28/2021): While the predictive performance of modern statistical dependency parser...
