Multi-task learning for natural language processing in the 2020s: where are we going?

07/22/2020
by Joseph Worsham, et al.

Multi-task learning (MTL) significantly pre-dates the deep learning era, and it has seen a resurgence in the past few years as researchers have applied MTL to deep learning solutions for natural language tasks. While steady MTL research has always been present, interest has grown recently, driven by the impressive successes published in the related fields of transfer learning and pre-training, such as BERT, and by the release of new challenge problems, such as GLUE and the NLP Decathlon (decaNLP). These efforts place more focus on how weights are shared across networks, evaluate the re-usability of network components, and identify use cases where MTL can significantly outperform single-task solutions. This paper strives to provide a comprehensive survey of the numerous recent MTL contributions to the field of natural language processing and to provide a forum for focusing efforts on the hardest unsolved problems of the next decade. While novel models that improve performance on NLP benchmarks are continually produced, lasting MTL challenges remain unsolved, and these could hold the key to better language understanding, knowledge discovery and natural language interfaces.

Related research

04/07/2022
A Survey of Multi-task Learning in Natural Language Processing: Regarding Task Relatedness and Training Methods
Multi-task learning (MTL) has become increasingly popular in natural lan...

09/19/2021
Multi-Task Learning in Natural Language Processing: An Overview
Deep learning approaches have achieved great success in the field of Nat...

05/29/2020
Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP
Transfer learning, particularly approaches that combine multi-task learn...

08/11/2022
Searching for chromate replacements using natural language processing and machine learning algorithms
The past few years has seen the application of machine learning utilised...

08/16/2023
Challenges and Opportunities of Using Transformer-Based Multi-Task Learning in NLP Through ML Lifecycle: A Survey
The increasing adoption of natural language processing (NLP) models acro...

08/16/2021
Deep Natural Language Processing for LinkedIn Search
Many search systems work with large amounts of natural language data, e....

07/18/2022
BERT: A Review of Natural Language Processing and Understanding Applications
We cover the use of BERT, one of the most well-liked deep learning-based...
