Curbing Task Interference using Representation Similarity-Guided Multi-Task Feature Sharing

08/19/2022
by Naresh Kumar Gurulingan, et al.

Multi-task learning of dense prediction tasks by sharing both the encoder and the decoder, as opposed to sharing only the encoder, is an attractive avenue to increase both accuracy and computational efficiency. When tasks are similar, a shared decoder serves as an additional inductive bias, giving tasks more room to exchange complementary information. However, increased sharing exposes more parameters to task interference, which likely hinders both generalization and robustness. Effective ways to curb this interference while still exploiting the inductive bias of a shared decoder remain an open challenge. To address this challenge, we propose Progressive Decoder Fusion (PDF), which progressively combines task decoders based on inter-task representation similarity. We show that this procedure yields a multi-task network with better generalization to in-distribution and out-of-distribution data and improved robustness to adversarial attacks. Additionally, we observe that the predictions of the different tasks of this multi-task network are more consistent with each other.
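The abstract does not spell out how representation similarity is measured or how decoder pairs are chosen, so the following is only an illustrative sketch. It assumes linear Centered Kernel Alignment (CKA), a common representation-similarity measure, and a hypothetical helper `most_similar_pair` that picks the pair of task decoders whose features are most alike as the next fusion candidate; the task names and shapes are invented for the example.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between two representation matrices of shape
    (n_samples, n_features); returns a value in [0, 1]."""
    X = X - X.mean(axis=0)          # center each feature dimension
    Y = Y - Y.mean(axis=0)
    num = np.linalg.norm(Y.T @ X, "fro") ** 2
    den = np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")
    return num / den

def most_similar_pair(reps):
    """Given {task_name: representation matrix}, return the task pair
    with the highest CKA similarity -- a candidate for decoder fusion."""
    tasks = list(reps)
    best, best_sim = None, -1.0
    for i in range(len(tasks)):
        for j in range(i + 1, len(tasks)):
            sim = linear_cka(reps[tasks[i]], reps[tasks[j]])
            if sim > best_sim:
                best, best_sim = (tasks[i], tasks[j]), sim
    return best, best_sim

# Example: "seg" and "depth" features are nearly identical, "normals" is unrelated,
# so a progressive scheme would fuse the seg/depth decoders first.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 8))
reps = {
    "seg": A,
    "depth": A + 0.01 * rng.normal(size=(50, 8)),
    "normals": rng.normal(size=(50, 8)),
}
pair, sim = most_similar_pair(reps)
```

A progressive procedure would repeat this selection after each fusion-and-retrain step, so that dissimilar tasks keep separate decoder parameters for longer.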


research
05/20/2023

Dynamic Gradient Balancing for Enhanced Adversarial Attacks on Multi-Task Models

Multi-task learning (MTL) creates a single machine learning model called...
research
01/19/2020

Gradient Surgery for Multi-Task Learning

While deep learning and deep reinforcement learning (RL) systems have de...
research
06/07/2023

Sample-Level Weighting for Multi-Task Learning with Auxiliary Tasks

Multi-task learning (MTL) can improve the generalization performance of ...
research
11/20/2021

Safe Multi-Task Learning

In recent years, Multi-Task Learning (MTL) attracts much attention due t...
research
04/14/2022

Leveraging convergence behavior to balance conflicting tasks in multi-task learning

Multi-Task Learning is a learning paradigm that uses correlated tasks to...
research
04/30/2023

Multi-Task Structural Learning using Local Task Similarity induced Neuron Creation and Removal

Multi-task learning has the potential to improve generalization by maxim...
research
04/05/2019

Learning Task Relatedness in Multi-Task Learning for Images in Context

Multimedia applications often require concurrent solutions to multiple t...
