Challenges and Opportunities of Using Transformer-Based Multi-Task Learning in NLP Through ML Lifecycle: A Survey

08/16/2023
by   Lovre Torbarina, et al.

The increasing adoption of natural language processing (NLP) models across industries has created a need for machine learning systems that handle these models efficiently, from training to serving in production. However, training, deploying, and updating multiple models can be complex, costly, and time-consuming, especially when using transformer-based pre-trained language models. Multi-Task Learning (MTL) has emerged as a promising approach to improve efficiency and performance through joint training, rather than training separate models. Motivated by this, we first provide an overview of transformer-based MTL approaches in NLP. Then, we discuss the challenges and opportunities of using MTL approaches throughout typical ML lifecycle phases, focusing specifically on challenges related to the data engineering, model development, deployment, and monitoring phases. This survey focuses on transformer-based MTL architectures and, to the best of our knowledge, is novel in that it systematically analyses how transformer-based MTL in NLP fits into ML lifecycle phases. Furthermore, we motivate research on the connection between MTL and continual learning (CL), as this area remains underexplored. We believe it would be practical to have a model that can handle both MTL and CL, as this would make it easier to periodically re-train the model, update it due to distribution shifts, and add new capabilities to meet real-world requirements.
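The joint-training idea behind MTL can be illustrated with a minimal hard-parameter-sharing sketch: a single shared encoder produces one representation, and small task-specific heads map it into each task's output space, so one model serves several tasks instead of training separate models. The shapes, task names, and weights below are hypothetical illustrations, not the paper's architecture; a real transformer-based system would replace the toy encoder with a pre-trained language model.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_hidden = 8, 16
W_shared = rng.normal(size=(d_in, d_hidden))  # shared "encoder" weights (toy stand-in for a transformer)

# Task-specific heads: e.g. a 3-way and a 2-way classification task (hypothetical tasks).
heads = {
    "task_a": rng.normal(size=(d_hidden, 3)),
    "task_b": rng.normal(size=(d_hidden, 2)),
}

def forward(x, task):
    h = np.tanh(x @ W_shared)  # one shared representation for all tasks
    return h @ heads[task]     # lightweight task-specific projection to logits

x = rng.normal(size=(4, d_in))  # a toy batch of 4 examples
logits_a = forward(x, "task_a")
logits_b = forward(x, "task_b")
print(logits_a.shape, logits_b.shape)  # (4, 3) (4, 2)
```

In joint training, the per-task losses would be summed (or weighted) and backpropagated through both the heads and the shared encoder, which is where the efficiency gains over maintaining separate models come from.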

Related research

- A Survey of Multi-task Learning in Natural Language Processing: Regarding Task Relatedness and Training Methods (04/07/2022)
  Multi-task learning (MTL) has become increasingly popular in natural lan...

- Multi-Task Learning in Natural Language Processing: An Overview (09/19/2021)
  Deep learning approaches have achieved great success in the field of Nat...

- Multi-task learning for natural language processing in the 2020s: where are we going? (07/22/2020)
  Multi-task learning (MTL) significantly pre-dates the deep learning era,...

- Multi-task Active Learning for Pre-trained Transformer-based Models (08/10/2022)
  Multi-task learning, in which several tasks are jointly learned by a sin...

- ModelCI-e: Enabling Continual Learning in Deep Learning Serving Systems (06/06/2021)
  MLOps is about taking experimental ML models to production, i.e., servin...

- The Stem Cell Hypothesis: Dilemma behind Multi-Task Learning with Transformer Encoders (09/14/2021)
  Multi-task learning with transformer encoders (MTL) has emerged as a pow...

- A new hope for network model generalization (07/12/2022)
  Generalizing machine learning (ML) models for network traffic dynamics t...
