How transferable are the datasets collected by active learners?

07/12/2018
by David Lowell, et al.

Active learning is a widely used training strategy for maximizing predictive performance subject to a fixed annotation budget. Between rounds of training, an active learner iteratively selects examples for annotation, typically based on some measure of the model's uncertainty, coupling the acquired dataset with the underlying model. However, owing to the high cost of annotation and the rapid pace of model development, labeled datasets may remain valuable long after a particular model is surpassed by new technology. In this paper, we investigate the transferability of datasets collected with an acquisition model A to a distinct successor model S. We seek to characterize whether the benefits of active learning persist when A and S are different models. To this end, we consider two standard NLP tasks and associated datasets: text classification and sequence tagging. We find that training S on a dataset actively acquired with a (different) model A typically yields worse performance than when S is trained with "native" data (i.e., acquired actively using S), and often performs worse than training on i.i.d. sampled data. These findings have implications for the use of active learning in practice, suggesting that it is better suited to cases where models are updated no more frequently than labeled data.
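The acquire-then-transfer setup the abstract describes can be sketched concretely. The following is an illustrative sketch, not the paper's experimental setup: uncertainty sampling (least-confidence on a binary task) with a logistic-regression acquisition model A, after which a distinct successor model S (here, naive Bayes) is trained on the actively acquired dataset and compared against S trained on an i.i.d. sample of the same size. The models, dataset, budget, and round sizes are all assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_pool, y_pool = X[:1500], y[:1500]   # "unlabeled" pool (labels revealed on query)
X_test, y_test = X[1500:], y[1500:]

# Seed set, then iterative uncertainty sampling with acquisition model A.
labeled = list(range(20))
A = LogisticRegression(max_iter=1000)
for _ in range(10):                               # 10 acquisition rounds
    A.fit(X_pool[labeled], y_pool[labeled])
    p = A.predict_proba(X_pool)[:, 1]
    uncertainty = -np.abs(p - 0.5)                # closest to 0.5 = most uncertain
    ranked = np.argsort(uncertainty)[::-1]
    seen = set(labeled)
    labeled.extend([i for i in ranked if i not in seen][:20])

# Train successor model S on the dataset actively acquired with A,
# versus S trained on an i.i.d. sample of the same size.
S_active = GaussianNB().fit(X_pool[labeled], y_pool[labeled])
iid = rng.choice(len(X_pool), size=len(labeled), replace=False)
S_iid = GaussianNB().fit(X_pool[iid], y_pool[iid])

acc_active = S_active.score(X_test, y_test)
acc_iid = S_iid.score(X_test, y_test)
print(f"S on data actively acquired with A: {acc_active:.3f}")
print(f"S on i.i.d. sampled data:           {acc_iid:.3f}")
```

Comparing `acc_active` against `acc_iid` (and against S trained on data acquired using S itself) is the paper's central question; the abstract reports that the transferred, actively acquired set often underperforms the i.i.d. baseline.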


