Multi-task Active Learning for Pre-trained Transformer-based Models

08/10/2022
by   Guy Rotman, et al.

Multi-task learning, in which several tasks are jointly learned by a single model, allows NLP models to share information from multiple annotations and may facilitate better predictions when the tasks are interrelated. This technique, however, requires annotating the same text with multiple annotation schemes, which may be costly and laborious. Active learning (AL) has been demonstrated to optimize annotation processes by iteratively selecting unlabeled examples whose annotation is most valuable for the NLP model. Yet, multi-task active learning (MT-AL) has not been applied to state-of-the-art pre-trained Transformer-based NLP models. This paper aims to close this gap. We explore various multi-task selection criteria in three realistic multi-task scenarios, reflecting different relations between the participating tasks, and demonstrate the effectiveness of multi-task compared to single-task selection. Our results suggest that MT-AL can be effectively used to minimize annotation efforts for multi-task NLP models.
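To make the iterative selection loop concrete, the sketch below shows one possible multi-task acquisition step: predictive entropy is computed per task and averaged across tasks to rank unlabeled examples. This is only a minimal illustration of one selection criterion, not the paper's specific method (the paper compares several criteria); the model object, its predict_proba(task, texts) helper, and the task names are hypothetical placeholders for a shared pre-trained Transformer encoder with per-task heads.

import numpy as np

def entropy(probs):
    # Predictive entropy per example; probs has shape (n_examples, n_classes).
    return -np.sum(probs * np.log(probs + 1e-12), axis=1)

def select_batch_multi_task(model, unlabeled_texts, tasks, batch_size):
    # Score every unlabeled example on every task, average the per-task
    # uncertainty scores, and return indices of the most uncertain examples.
    # model.predict_proba(task, texts) is a hypothetical helper that runs the
    # shared encoder with the given task head and returns class probabilities.
    per_task_scores = [entropy(model.predict_proba(task, unlabeled_texts))
                       for task in tasks]
    combined = np.mean(per_task_scores, axis=0)
    return np.argsort(-combined)[:batch_size]

# One AL round: select a batch, have annotators label it for all tasks,
# add it to the labeled pool, and fine-tune the multi-task model, e.g.:
# chosen = select_batch_multi_task(model, pool_texts, ["ner", "parsing"], 100)

Averaging the per-task scores treats all tasks as equally important; other combination schemes (e.g., weighting tasks by difficulty or annotation cost) are equally plausible under this sketch.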

Related research

11/21/2022
PartAL: Efficient Partial Active Learning in Multi-Task Visual Settings
Multi-task learning is central to many real-world applications. Unfortun...

06/21/2023
Multi-Task Consistency for Active Learning
Learning-based solutions for vision tasks require a large amount of labe...

05/23/2023
EASE: An Easily-Customized Annotation System Powered by Efficiency Enhancement Mechanisms
The performance of current supervised AI systems is tightly connected to...

09/13/2021
GradTS: A Gradient-Based Automatic Auxiliary Task Selection Method Based on Transformer Networks
A key problem in multi-task learning (MTL) research is how to select hig...

08/16/2018
The DALPHI annotation framework & how its pre-annotations can improve annotator efficiency
Producing the required amounts of training data for machine learning and...

08/16/2023
Challenges and Opportunities of Using Transformer-Based Multi-Task Learning in NLP Through ML Lifecycle: A Survey
The increasing adoption of natural language processing (NLP) models acro...

11/10/2020
Multi-Task Sequence Prediction For Tunisian Arabizi Multi-Level Annotation
In this paper we propose a multi-task sequence prediction system, based ...
