A Generalized Recurrent Neural Architecture for Text Classification with Multi-Task Learning

07/10/2017
by Honglun Zhang, et al.

Multi-task learning leverages potential correlations among related tasks to extract common features and yield performance gains. However, most previous works only consider simple or weak interactions, and therefore fail to model complex correlations among three or more tasks. In this paper, we propose a multi-task learning architecture with four types of recurrent neural layers to fuse information across multiple related tasks. The architecture is structurally flexible, considers various interactions among tasks, and can be regarded as a generalization of many previous models. Extensive experiments on five benchmark datasets for text classification show that our model significantly improves the performance of related tasks by leveraging additional information from the others.
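
To make the idea concrete, the following is a minimal sketch, not the paper's exact four-layer architecture: a shared-private recurrent multi-task text classifier in which every task reads from one shared LSTM and also owns a private LSTM, and the two final states are fused before classification. All names (SharedPrivateClassifier, the hidden sizes, the dummy batch) are illustrative assumptions, not taken from the paper.

    import torch
    import torch.nn as nn

    class SharedPrivateClassifier(nn.Module):
        # Illustrative shared-private multi-task model (assumed structure,
        # not the authors' architecture): one shared recurrent layer plus
        # one private recurrent layer and output head per task.
        def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes_per_task):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)
            # Recurrent layer shared by every task.
            self.shared_rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            # Task-specific recurrent layers and classification heads.
            self.private_rnns = nn.ModuleList(
                [nn.LSTM(embed_dim, hidden_dim, batch_first=True)
                 for _ in num_classes_per_task])
            self.heads = nn.ModuleList(
                [nn.Linear(2 * hidden_dim, c) for c in num_classes_per_task])

        def forward(self, token_ids, task_id):
            x = self.embedding(token_ids)                   # (batch, seq, embed)
            _, (h_shared, _) = self.shared_rnn(x)           # final shared state
            _, (h_private, _) = self.private_rnns[task_id](x)
            # Fuse shared and task-specific features before classifying.
            fused = torch.cat([h_shared[-1], h_private[-1]], dim=-1)
            return self.heads[task_id](fused)

    # Example with two text-classification tasks; in joint training one
    # would sample batches from each task and sum the per-task losses.
    model = SharedPrivateClassifier(vocab_size=10000, embed_dim=128,
                                    hidden_dim=64, num_classes_per_task=[2, 5])
    tokens = torch.randint(0, 10000, (8, 40))               # dummy batch for task 0
    logits = model(tokens, task_id=0)                       # shape (8, 2)

The paper's architecture fuses information through several types of recurrent layers rather than simple concatenation; the sketch only illustrates the general shared-versus-private decomposition that such multi-task models build on.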
