Branched Multi-Task Networks: Deciding What Layers To Share

04/05/2019
by   Simon Vandenhende, et al.

In the context of deep learning, neural networks with multiple branches, each solving a different task, are commonly used. Such ramified networks typically start with a number of shared layers, after which the different tasks branch out into their own sequences of layers. Because the number of possible network configurations is combinatorially large, prior work has often relied on ad hoc methods to determine the level of layer sharing. This work proposes a principled method to assess task relatedness. We base the relatedness of a task pair on how useful a set of features learned for one task is for the other, and vice versa. The resulting task affinities are used for the automated construction of a branched multi-task network in which deeper layers gradually grow more task-specific. Our multi-task network outperforms the state of the art on CelebA. Additionally, the layer-sharing schemes devised by our method outperform common multi-task learning models that were constructed ad hoc. We include additional experiments on Cityscapes and SUN RGB-D to illustrate the wide applicability of our approach. Code and trained models for this paper are made available at https://github.com/SimonVandenhende/
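To make the construction idea concrete, the following is a minimal, illustrative sketch (not the authors' released code) of one way to turn a pairwise task-affinity matrix into task groupings that could share a branch: greedily merge the two most-affine task groups until a branch budget is reached. The function name, the toy affinity values, and the greedy merge rule are all assumptions for illustration.

```python
# Illustrative sketch, assuming a symmetric task-affinity matrix is
# already computed (e.g., from feature usefulness across task pairs).
# Greedily merge the two groups with the highest average pairwise
# affinity until only `num_branches` groups remain.

def branch_tasks(affinity, num_branches):
    """Group task indices into `num_branches` clusters by average affinity."""
    groups = [[i] for i in range(len(affinity))]
    while len(groups) > num_branches:
        best_score, best_pair = -1.0, None
        for a in range(len(groups)):
            for b in range(a + 1, len(groups)):
                # Average affinity between every task in group a and group b.
                score = sum(affinity[i][j]
                            for i in groups[a]
                            for j in groups[b]) / (len(groups[a]) * len(groups[b]))
                if score > best_score:
                    best_score, best_pair = score, (a, b)
        a, b = best_pair
        groups[a] += groups.pop(b)  # merge the most-affine pair of groups
    return [sorted(g) for g in groups]

# Toy affinity matrix for 4 tasks: tasks 0/1 and 2/3 are strongly related.
A = [
    [1.0, 0.9, 0.1, 0.2],
    [0.9, 1.0, 0.2, 0.1],
    [0.1, 0.2, 1.0, 0.8],
    [0.2, 0.1, 0.8, 1.0],
]
print(branch_tasks(A, 2))  # → [[0, 1], [2, 3]]
```

Applied at several depths with progressively larger branch budgets, a grouping like this yields the tree structure the paper describes: shallow layers shared by all tasks, deeper layers shared only within affine task groups.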

Related research:

- AdaShare: Learning What To Share For Efficient Deep Multi-Task Learning (11/27/2019)
- Learning to Branch for Multi-Task Learning (06/02/2020)
- Network Clustering for Multi-task Learning (01/22/2021)
- Fully-adaptive Feature Sharing in Multi-Task Networks with Applications in Person Attribute Classification (11/16/2016)
- Rethinking Hard-Parameter Sharing in Multi-Task Learning (07/23/2021)
- Multi-Stage Prediction Networks for Data Harmonization (07/26/2019)
- Meta-Learning Symmetries by Reparameterization (07/06/2020)
