The Stem Cell Hypothesis: Dilemma behind Multi-Task Learning with Transformer Encoders

09/14/2021
by   Han He, et al.
Multi-task learning (MTL) with transformer encoders has emerged as a powerful technique for improving both accuracy and efficiency on closely related tasks, yet it remains an open question whether it performs as well on tasks that are distinct in nature. We first present MTL results on five NLP tasks, POS, NER, DEP, CON, and SRL, and show where it falls short of single-task learning. We then conduct an extensive pruning analysis showing that a certain set of attention heads is claimed by most tasks during MTL, and that these tasks interfere with one another as each fine-tunes those heads for its own objective. Based on this finding, we propose the Stem Cell Hypothesis: there exist attention heads naturally talented for many tasks that cannot be jointly trained to create adequate embeddings for all of those tasks. Finally, we design novel parameter-free probes to justify our hypothesis and, through label analysis, demonstrate how attention heads are transformed across the five tasks during MTL.
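The MTL setup the abstract describes can be illustrated with a hard-parameter-sharing sketch: one transformer encoder shared by all tasks, with a separate classification head per task. This is a minimal toy example in PyTorch, not the paper's actual implementation; the paper fine-tunes pretrained encoders, while here a small randomly initialized encoder and the hypothetical label counts (17 POS tags, 9 NER tags) stand in for illustration.

```python
import torch
import torch.nn as nn

class SharedEncoderMTL(nn.Module):
    """Hard parameter sharing: one shared transformer encoder,
    one linear classification head per task (toy sketch)."""

    def __init__(self, vocab_size=1000, d_model=64, nhead=4, nlayers=2,
                 task_labels=None):
        super().__init__()
        # hypothetical label-set sizes, for illustration only
        task_labels = task_labels or {"pos": 17, "ner": 9}
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, nlayers)
        # the encoder (and its attention heads) is shared across tasks;
        # only these heads are task-specific
        self.heads = nn.ModuleDict(
            {task: nn.Linear(d_model, n) for task, n in task_labels.items()})

    def forward(self, token_ids, task):
        h = self.encoder(self.embed(token_ids))  # (batch, seq, d_model)
        return self.heads[task](h)               # per-token logits for `task`

model = SharedEncoderMTL()
x = torch.randint(0, 1000, (2, 8))  # toy batch: 2 sentences, 8 tokens each
pos_logits = model(x, "pos")        # shape (2, 8, 17)
ner_logits = model(x, "ner")        # shape (2, 8, 9)
```

Because every task's loss backpropagates through the same encoder, the attention heads that many tasks rely on are pulled toward competing objectives, which is the interference the pruning analysis above examines.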


Related research

01/23/2019 · Sentiment and Sarcasm Classification with Multitask Learning
Sentiment classification and sarcasm detection are both important NLP ta...

04/11/2018 · Multi-Task Learning for Argumentation Mining
We investigate whether and where multi-task learning (MTL) can improve p...

05/25/2022 · Eliciting Transferability in Multi-task Learning with Task-level Mixture-of-Experts
Recent work suggests that transformer models are capable of multi-task l...

08/13/2018 · Multi-Task Learning for Sequence Tagging: An Empirical Study
We study three general multi-task learning (MTL) approaches on 11 sequen...

05/31/2018 · Asymptotic performance of regularized multi-task learning
This paper analyzes asymptotic performance of a regularized multi-task l...

10/15/2021 · Transformer-based Multi-task Learning for Disaster Tweet Categorisation
Social media has enabled people to circulate information in a timely fas...

08/16/2023 · Challenges and Opportunities of Using Transformer-Based Multi-Task Learning in NLP Through ML Lifecycle: A Survey
The increasing adoption of natural language processing (NLP) models acro...
