Gumbel-Matrix Routing for Flexible Multi-task Learning

10/10/2019
by Krzysztof Maziarz, et al.

This paper proposes a novel per-task routing method for multi-task applications. Multi-task neural networks can learn to transfer knowledge across different tasks by using parameter sharing. However, sharing parameters between unrelated tasks can hurt performance. To address this issue, we advocate the use of routing networks to learn flexible parameter sharing, where each group of parameters is shared with a different subset of tasks in order to better leverage task relatedness. At the same time, routing networks are known to be notoriously hard to train. We propose Gumbel-Matrix routing, a novel multi-task routing method designed to learn fine-grained patterns of parameter sharing. The routing is learned jointly with the model parameters by standard back-propagation thanks to the Gumbel-Softmax trick. When applied to the Omniglot benchmark, the proposed method reduces the state-of-the-art error rate by 17%.
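
To make the mechanism in the abstract concrete, below is a minimal sketch (not the authors' code) of how a binary task-to-component routing matrix can be trained end-to-end with the straight-through Gumbel-Softmax. The class and parameter names (GumbelMatrixRouter, num_components, tau) are illustrative assumptions, and the way the selected components are combined is a simplified stand-in for the architecture described in the paper.

```python
# A sketch of Gumbel-Matrix-style routing: each (task, component) entry of a
# binary routing matrix is sampled with the straight-through Gumbel-Softmax,
# so the routing logits are trained jointly with the model parameters by
# ordinary back-propagation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GumbelMatrixRouter(nn.Module):
    def __init__(self, num_tasks: int, num_components: int, tau: float = 1.0):
        super().__init__()
        # Two logits per (task, component) pair: index 0 = "off", index 1 = "on".
        self.logits = nn.Parameter(torch.zeros(num_tasks, num_components, 2))
        self.tau = tau

    def forward(self, task_id: int) -> torch.Tensor:
        # Straight-through Gumbel-Softmax: hard 0/1 samples in the forward
        # pass, soft gradients in the backward pass.
        sample = F.gumbel_softmax(self.logits[task_id], tau=self.tau, hard=True)
        return sample[..., 1]  # shape (num_components,), entries in {0, 1}

# Hypothetical usage: gate the outputs of parallel components for one task.
router = GumbelMatrixRouter(num_tasks=8, num_components=4)
components = nn.ModuleList([nn.Linear(16, 16) for _ in range(4)])
x = torch.randn(32, 16)
mask = router(task_id=3)                        # binary routing row for task 3
outs = torch.stack([c(x) for c in components])  # (4, 32, 16)
y = (mask[:, None, None] * outs).sum(dim=0)     # keep only selected components
```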

Related research

11/03/2017
Routing Networks: Adaptive Selection of Non-linear Functions for Multi-Task Learning
Multi-task learning (MTL) with neural networks leverages commonalities i...

03/28/2019
Many Task Learning with Task Routing
Typical multi-task learning (MTL) methods rely on architectural adjustme...

10/21/2022
PaCo: Parameter-Compositional Multi-Task Reinforcement Learning
The purpose of multi-task reinforcement learning (MTRL) is to train a si...

03/30/2020
Multi-Task Reinforcement Learning with Soft Modularization
Multi-task learning is a very challenging problem in reinforcement learn...

09/01/2020
Boosting share routing for multi-task learning
Multi-task learning (MTL) aims to make full use of the knowledge contain...

04/29/2020
Task-Feature Collaborative Learning with Application to Personalized Attribute Prediction
As an effective learning paradigm against insufficient training samples,...

11/07/2022
Multi-Head Adapter Routing for Data-Efficient Fine-Tuning
Parameter-efficient fine-tuning (PEFT) methods can adapt large language ...
