Knowledge Distillation for Multi-task Learning

07/14/2020
by Wei-Hong Li, et al.

Multi-task learning (MTL) aims to learn a single model that performs multiple tasks, achieving good performance on all of them at a lower computational cost. Learning such a model requires jointly optimizing the losses of a set of tasks with different difficulty levels, magnitudes, and characteristics (e.g. cross-entropy, Euclidean loss), which leads to an imbalance problem in multi-task learning. To address this imbalance, we propose a knowledge distillation based method. We first learn a task-specific model for each task. We then train the multi-task model to minimize the task-specific losses while also producing the same features as the task-specific models. Since each task-specific network encodes different features, we introduce small task-specific adaptors that project the multi-task features onto the task-specific features. The adaptors thus align the task-specific and multi-task features, enabling balanced parameter sharing across tasks. Extensive experimental results demonstrate that our method optimizes a multi-task learning model in a more balanced way and achieves better overall performance.
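The objective described in the abstract (supervised task losses plus an adaptor-based feature-distillation term against frozen task-specific teachers) can be sketched in a few lines. The snippet below is a minimal, hypothetical PyTorch illustration, not the authors' released code: names such as `TaskAdaptor`, `mtl_distillation_step`, and `lambda_distill` are assumptions made for clarity.

```python
# Minimal sketch of the two-term training objective described above,
# assuming PyTorch and illustrative module/variable names.
import torch
import torch.nn as nn

class TaskAdaptor(nn.Module):
    """Small per-task adaptor that projects the shared multi-task feature
    into the feature space of the corresponding (frozen) single-task model."""
    def __init__(self, feat_dim):
        super().__init__()
        self.proj = nn.Linear(feat_dim, feat_dim)

    def forward(self, shared_feat):
        return self.proj(shared_feat)

def mtl_distillation_step(backbone, heads, adaptors, teachers, losses, batch,
                          lambda_distill=1.0):
    """One optimization step: task-specific losses plus a feature-alignment
    (distillation) term between adapted multi-task features and the features
    of pre-trained, frozen task-specific teachers."""
    x, targets = batch                 # targets: dict task_name -> label
    shared_feat = backbone(x)          # shared multi-task feature

    total_loss = 0.0
    for task in heads:
        # Supervised task loss (e.g. cross-entropy or Euclidean loss).
        pred = heads[task](shared_feat)
        total_loss = total_loss + losses[task](pred, targets[task])

        # Distillation: align the adapted shared feature with the frozen
        # task-specific teacher's feature for the same input.
        with torch.no_grad():
            teacher_feat = teachers[task](x)
        aligned_feat = adaptors[task](shared_feat)
        total_loss = total_loss + lambda_distill * nn.functional.mse_loss(
            aligned_feat, teacher_feat)

    return total_loss
```

In this sketch the teachers are the pre-trained task-specific networks and are kept frozen, so gradients flow only through the shared backbone, the task heads, and the small adaptors.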

