Understanding and Improving Information Transfer in Multi-Task Learning

05/02/2020
by   Sen Wu, et al.

We investigate multi-task learning approaches that use a shared feature representation for all tasks. To better understand the transfer of task information, we study an architecture with a shared module for all tasks and a separate output module for each task. We study the theory of this setting on linear and ReLU-activated models. Our key observation is that whether or not tasks' data are well-aligned can significantly affect the performance of multi-task learning. We show that misalignment between task data can cause negative transfer (i.e., hurt performance) and provide sufficient conditions for positive transfer. Inspired by these theoretical insights, we show that aligning tasks' embedding layers leads to performance gains for multi-task training and transfer learning on the GLUE benchmark and sentiment analysis tasks; for example, we obtain a 2.35% average GLUE score improvement over BERT-LARGE using our alignment method. We also design an SVD-based task reweighting scheme and show that it improves the robustness of multi-task training on a multi-label image dataset.
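The key observation above, that transfer depends on how well tasks' data are aligned, can be made concrete with an SVD on each task's input matrix. The sketch below (illustrative only; the toy data, the rank r, and the principal-angle overlap metric are assumptions, not the paper's exact method) compares the top singular subspaces of two tasks' inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (hypothetical): each task's inputs form an (n x d) matrix.
n, d, r = 100, 20, 5
shared = rng.normal(size=(n, r)) @ rng.normal(size=(r, d))  # rank-r signal
task1 = shared
task2_aligned = shared + 0.1 * rng.normal(size=(n, d))      # same directions, small noise
task2_misaligned = rng.normal(size=(n, r)) @ rng.normal(size=(r, d))  # unrelated directions

def top_subspace(X, r):
    # Orthonormal basis (d x r) for the top-r right singular subspace.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:r].T

def alignment(Xa, Xb, r=5):
    # Mean squared cosine of the principal angles between the two
    # top-r subspaces: 1.0 means identical, roughly r/d for random subspaces.
    s = np.linalg.svd(top_subspace(Xa, r).T @ top_subspace(Xb, r),
                      compute_uv=False)
    return float(np.mean(s ** 2))

print(alignment(task1, task2_aligned))     # high: shared directions, transfer should help
print(alignment(task1, task2_misaligned))  # low: misalignment, risk of negative transfer
```

A score near 1 indicates the tasks' data occupy nearly the same principal directions (a setting where a shared module can help), while a low score flags the misalignment that the paper identifies as a cause of negative transfer.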


Related research:

- Learning from Auxiliary Sources in Argumentative Revision Classification (09/13/2023): We develop models to classify desirable reasoning revisions in argumenta...
- Do Text-to-Text Multi-Task Learners Suffer from Task Conflict? (12/13/2022): Traditional multi-task learning architectures train a single model acros...
- A Modulation Module for Multi-task Learning with Applications in Image Retrieval (07/17/2018): Multi-task learning has been widely adopted in many computer vision task...
- Multi-Task Structural Learning using Local Task Similarity induced Neuron Creation and Removal (04/30/2023): Multi-task learning has the potential to improve generalization by maxim...
- Multi-label Multi-task Deep Learning for Behavioral Coding (10/29/2018): We propose a methodology for estimating human behaviors in psychotherapy...
- Using Multi-task and Transfer Learning to Solve Working Memory Tasks (09/28/2018): We propose a new architecture called Memory-Augmented Encoder-Solver (MA...
- Egocentric Video Task Translation (12/13/2022): Different video understanding tasks are typically treated in isolation, ...
