An Overview of Multi-Task Learning in Deep Neural Networks

06/15/2017
by Sebastian Ruder

Multi-task learning (MTL) has led to successes in many applications of machine learning, from natural language processing and speech recognition to computer vision and drug discovery. This article aims to give a general overview of MTL, particularly in deep neural networks. It introduces the two most common methods for MTL in Deep Learning (hard and soft parameter sharing of hidden layers), gives an overview of the literature, and discusses recent advances. In particular, it seeks to help ML practitioners apply MTL by shedding light on how MTL works and providing guidelines for choosing appropriate auxiliary tasks.
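To make the hard parameter sharing setup concrete, the sketch below shows a shared hidden trunk with one task-specific output head per task, trained by summing the per-task losses. This is an illustrative PyTorch example, not code from the article; the layer sizes, task definitions (one classification and one regression task), and the unweighted loss sum are assumptions chosen for brevity.

import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    """Hard parameter sharing: a shared trunk with task-specific heads.

    Layer sizes and the number of tasks are illustrative placeholders.
    """
    def __init__(self, in_dim=64, hidden_dim=128, task_out_dims=(10, 1)):
        super().__init__()
        # Shared hidden layers, reused by every task.
        self.shared = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # One output head per task (here: 10-way classification and scalar regression).
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, out_dim) for out_dim in task_out_dims]
        )

    def forward(self, x):
        h = self.shared(x)
        return [head(h) for head in self.heads]

# Joint training step: compute each task's loss and sum them
# (per-task loss weights could be added here).
model = HardSharingMTL()
x = torch.randn(32, 64)
y_cls = torch.randint(0, 10, (32,))
y_reg = torch.randn(32, 1)
out_cls, out_reg = model(x)
loss = nn.CrossEntropyLoss()(out_cls, y_cls) + nn.MSELoss()(out_reg, y_reg)
loss.backward()

Soft parameter sharing, the other method the article introduces, would instead give each task its own model and encourage their parameters to stay close, for example via a regularization term on the distance between corresponding layers.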


