A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks

11/14/2018
by Victor Sanh, et al.

Much effort has been devoted to evaluating whether multi-task learning can be leveraged to learn rich representations that can be used in various Natural Language Processing (NLP) downstream applications. However, there is still a lack of understanding of the settings in which multi-task learning has a significant effect. In this work, we introduce a hierarchical model trained in a multi-task learning setup on a set of carefully selected semantic tasks. The model is trained in a hierarchical fashion to introduce an inductive bias by supervising a set of low-level tasks at the bottom layers of the model and more complex tasks at its top layers. This model achieves state-of-the-art results on a number of tasks, namely Named Entity Recognition, Entity Mention Detection and Relation Extraction, without hand-engineered features or external NLP tools such as syntactic parsers. The hierarchical training supervision induces a set of shared semantic representations at the lower layers of the model. We show that, as we move from the bottom to the top layers of the model, the hidden states of the layers tend to represent more complex semantic information.
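The abstract describes the core design: simpler tasks are supervised on the hidden states of lower encoder layers, while more complex tasks read higher layers. Below is a minimal sketch of that wiring, not the authors' implementation; the use of PyTorch, the stacked bidirectional LSTM encoder, the shortcut connections, and all layer sizes and tag counts are illustrative assumptions.

```python
# Minimal sketch of hierarchical multi-task supervision (illustrative only):
# a low-level task (NER) reads the bottom encoder level, entity mention
# detection the middle level, and relation extraction the top level.
import torch
import torch.nn as nn

class HierarchicalMTL(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=128,
                 n_ner_tags=9, n_emd_tags=9, n_relations=7):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Three stacked BiLSTM "levels"; each upper level also sees the word
        # embeddings through a shortcut connection (an assumption for brevity).
        self.level1 = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.level2 = nn.LSTM(emb_dim + 2 * hidden, hidden, batch_first=True, bidirectional=True)
        self.level3 = nn.LSTM(emb_dim + 2 * hidden, hidden, batch_first=True, bidirectional=True)
        # Task heads attached at increasing depth.
        self.ner_head = nn.Linear(2 * hidden, n_ner_tags)
        self.emd_head = nn.Linear(2 * hidden, n_emd_tags)
        self.rel_head = nn.Linear(4 * hidden, n_relations)  # scores token pairs

    def forward(self, token_ids):
        emb = self.embed(token_ids)                      # (B, T, E)
        h1, _ = self.level1(emb)                         # bottom layer: NER
        h2, _ = self.level2(torch.cat([emb, h1], -1))    # middle layer: EMD
        h3, _ = self.level3(torch.cat([emb, h2], -1))    # top layer: relations
        ner_logits = self.ner_head(h1)                   # (B, T, n_ner_tags)
        emd_logits = self.emd_head(h2)                   # (B, T, n_emd_tags)
        # Relation scores for every ordered token pair: (B, T, T, n_relations).
        B, T, H = h3.shape
        pairs = torch.cat([h3.unsqueeze(2).expand(B, T, T, H),
                           h3.unsqueeze(1).expand(B, T, T, H)], dim=-1)
        rel_logits = self.rel_head(pairs)
        return ner_logits, emd_logits, rel_logits

# Usage example with random token ids:
model = HierarchicalMTL(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 12))
ner_logits, emd_logits, rel_logits = model(tokens)
```

In a multi-task training loop, each batch would typically be drawn from one task's dataset and only that task's loss back-propagated through the shared layers below its head; the task sampling strategy and training schedule actually used are described in the full text.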


Related research

04/21/2018
Multi-task Learning for Universal Sentence Representations: What Syntactic and Semantic Information is Captured?
Learning distributed sentence representations is one of the key challeng...

04/25/2020
Hierarchical Multi Task Learning with Subword Contextual Embeddings for Languages with Rich Morphology
Morphological information is important for many sequence labeling tasks ...

05/06/2020
An Empirical Study of Multi-Task Learning on BERT for Biomedical Text Mining
Multi-task learning (MTL) has achieved remarkable success in natural lan...

01/29/2019
Two-Stream Multi-Task Network for Fashion Recognition
In this paper, we present a two-stream multi-task network for fashion re...

12/14/2018
A Neural Multi-Task Learning Framework to Jointly Model Medical Named Entity Recognition and Normalization
State-of-the-art studies have demonstrated the superiority of joint mode...

11/08/2022
Nimbus: Toward Speed Up Function Signature Recovery via Input Resizing and Multi-Task Learning
Function signature recovery is important for many binary analysis tasks ...

05/18/2020
Efficient Image Gallery Representations at Scale Through Multi-Task Learning
Image galleries provide a rich source of diverse information about a pro...
