ALLWAS: Active Learning on Language models in WASserstein space

09/03/2021
by   Anson Bastos, et al.

Active learning has emerged as a standard paradigm in areas where labeled training data are scarce, such as the medical domain. Language models are the prevalent choice for many natural language tasks owing to the performance gains they offer, but in domains such as medicine the scarcity of labeled training data remains a common issue, and these models may also underperform when class imbalance is prevalent. Active learning can help boost performance in these settings under a limited labeling budget. To this end, we propose ALLWAS, a novel method for active learning in language models that uses sampling techniques based on submodular optimization and optimal transport. We construct a sampling strategy based on submodular optimization of a designed objective in the gradient domain. Furthermore, to enable learning from few samples, we propose a novel strategy for sampling from Wasserstein barycenters. Our empirical evaluations on standard benchmark datasets for text classification show that our methods perform significantly better (>20%) than existing approaches for active learning on language models.
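To give a feel for the barycenter-sampling idea, here is a minimal 1-D sketch. In one dimension, the 2-Wasserstein barycenter of two distributions can be computed by averaging their quantile (inverse-CDF) functions, and new points can be drawn by evaluating that averaged quantile function at uniform random levels. This is an illustrative toy, not the paper's method: ALLWAS operates on higher-dimensional model representations with optimal-transport solvers, and the function names below are hypothetical.

```python
import numpy as np

def barycenter_quantiles(samples_a, samples_b, weight=0.5, n_points=101):
    # In 1-D, the W2 barycenter is obtained by averaging the
    # quantile functions (inverse CDFs) of the input distributions.
    qs = np.linspace(0.0, 1.0, n_points)
    qa = np.quantile(samples_a, qs)
    qb = np.quantile(samples_b, qs)
    return (1.0 - weight) * qa + weight * qb

def sample_from_barycenter(samples_a, samples_b, n_samples, weight=0.5, seed=None):
    # Draw new points by evaluating the averaged quantile
    # function at uniform random levels u ~ U(0, 1).
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.0, 1.0, size=n_samples)
    qa = np.quantile(samples_a, u)
    qb = np.quantile(samples_b, u)
    return (1.0 - weight) * qa + weight * qb
```

For example, the barycenter of samples concentrated near 0-2 and near 10-12 interpolates between them, so drawn points land in the 5-7 range; in an active-learning setting, such interpolated points from a minority class's distribution could serve as additional training signal.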

