
Emerging Relation Network and Task Embedding for Multi-Task Regression Problems

04/29/2020
by   Jens Schreiber, et al.
Universität Kassel

Multi-task learning (mtl) provides state-of-the-art results in many applications of computer vision and natural language processing. In contrast to single-task learning (stl), mtl allows knowledge to be leveraged between related tasks, improving prediction results either on the main task (as opposed to an auxiliary task) or on all tasks. However, there are few comparative studies on applying mtl architectures to regression and time series problems that take recent advances of mtl into account. An interesting, non-linear problem is the forecast of the expected power generation of renewable power plants. This article therefore provides a comparative study of the following recent and important mtl architectures: hard parameter sharing, the cross-stitch network, and the sluice network (sn). They are compared to a multi-layer perceptron model of similar size in an stl setting. Additionally, we provide a simple yet effective approach to modeling task-specific information through an embedding layer in a multi-layer perceptron, referred to as task embedding. Further, we introduce a new mtl architecture named emerging relation network (ern), which can be considered an extension of the sluice network. For a solar power dataset, the task embedding achieves the best mean improvement, with 14.9%; the mean improvement of the ern and the sn on the solar dataset is of similar magnitude, with 14.7%. On a second dataset, the ern achieves a significant improvement of up to 7.7%. Results suggest that an mtl architecture is especially beneficial when tasks are only loosely related and the prediction problem is more non-linear. In contrast, the proposed task embedding is advantageous when tasks are strongly correlated. Further, the task embedding provides an effective approach with reduced computational effort compared to other mtl architectures.
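To illustrate the task-embedding idea described above, here is a minimal NumPy sketch, not the authors' implementation: task-specific information enters a shared multi-layer perceptron through a learned per-task vector that is concatenated to the input features. All dimensions, names, and the random initialisation are hypothetical; training is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 4 tasks (e.g. individual power plants),
# 8 input features (e.g. weather variables), a 3-dimensional task embedding.
N_TASKS, N_FEATURES, EMB_DIM, HIDDEN = 4, 8, 3, 16

# Learnable parameters (randomly initialised here for illustration).
task_embedding = rng.normal(size=(N_TASKS, EMB_DIM))  # one row per task
W1 = rng.normal(size=(N_FEATURES + EMB_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(size=(HIDDEN, 1))
b2 = np.zeros(1)

def forward(x, task_id):
    """Look up the task vector, concatenate it with the features,
    and run a single-hidden-layer MLP shared across all tasks."""
    z = np.concatenate([x, task_embedding[task_id]])
    h = np.maximum(0.0, z @ W1 + b1)   # ReLU hidden layer
    return h @ W2 + b2                 # scalar power forecast

x = rng.normal(size=N_FEATURES)
y_task0 = forward(x, task_id=0)
y_task1 = forward(x, task_id=1)
# Same input, different task ids -> different predictions,
# because the embedding injects task-specific information.
```

Because all weights except the embedding rows are shared, this keeps the parameter count close to a plain stl perceptron, which is consistent with the reduced computational effort claimed for the task embedding.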

