Multi-task Learning for Universal Sentence Representations: What Syntactic and Semantic Information is Captured?

04/21/2018
by   Wasi Uddin Ahmad, et al.

Learning distributed sentence representations is one of the key challenges in natural language processing. Previous work demonstrated that a recurrent neural network (RNN)-based sentence encoder trained on a large collection of annotated natural language inference data transfers effectively to other related tasks. In this paper, we show that jointly learning multiple tasks yields more generalizable sentence representations, based on extensive experiments and analysis comparing multi-task and single-task learned sentence encoders. Quantitative analysis of the syntactic and semantic information captured by the sentence embeddings shows that multi-task learning captures syntactic information better, while single-task learning summarizes semantic information more coherently.
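The core idea of the multi-task setup described above can be illustrated with a minimal sketch: one shared RNN encoder produces a sentence embedding, and each task attaches its own lightweight classification head on top, so gradients from every task update the same encoder parameters. This is an illustrative NumPy toy, not the paper's actual architecture; the task names (`nli`, `sentiment`), dimensions, and vanilla-RNN cell are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

class SharedRNNEncoder:
    """Tiny vanilla-RNN sentence encoder shared across all tasks (toy sketch)."""
    def __init__(self, vocab_size, emb_dim, hid_dim):
        self.emb = rng.normal(0.0, 0.1, (vocab_size, emb_dim))
        self.W_xh = rng.normal(0.0, 0.1, (emb_dim, hid_dim))
        self.W_hh = rng.normal(0.0, 0.1, (hid_dim, hid_dim))
        self.b_h = np.zeros(hid_dim)

    def encode(self, token_ids):
        # Run the RNN over the token sequence; the final hidden state
        # serves as the fixed-size sentence embedding.
        h = np.zeros_like(self.b_h)
        for t in token_ids:
            h = np.tanh(self.emb[t] @ self.W_xh + h @ self.W_hh + self.b_h)
        return h

class TaskHead:
    """Task-specific linear classifier on top of the shared embedding."""
    def __init__(self, hid_dim, n_classes):
        self.W = rng.normal(0.0, 0.1, (hid_dim, n_classes))
        self.b = np.zeros(n_classes)

    def logits(self, sent_vec):
        return sent_vec @ self.W + self.b

# Shared encoder plus one head per task (hypothetical task set).
encoder = SharedRNNEncoder(vocab_size=100, emb_dim=16, hid_dim=32)
heads = {"nli": TaskHead(32, 3), "sentiment": TaskHead(32, 2)}

# In training, each mini-batch typically comes from a single task, but
# gradients from every task flow into the same shared encoder weights.
sentence = [5, 17, 42, 8]          # toy token-id sequence
vec = encoder.encode(sentence)     # shared sentence embedding
nli_scores = heads["nli"].logits(vec)
sentiment_scores = heads["sentiment"].logits(vec)
```

The single-task baseline compared in the paper corresponds to keeping only one head, so the encoder never receives gradient signal from other objectives.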


research
11/14/2018

A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks

Much effort has been devoted to evaluate whether multi-task learning ca...
research
03/30/2018

Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning

A lot of the recent success in natural language processing (NLP) has bee...
research
02/29/2016

Representation of linguistic form and function in recurrent neural networks

We present novel methods for analyzing the activation patterns of RNNs f...
research
07/07/2019

Graph based Neural Networks for Event Factuality Prediction using Syntactic and Semantic Structures

Event factuality prediction (EFP) is the task of assessing the degree to...
research
06/19/2018

Dynamic Multi-Level Multi-Task Learning for Sentence Simplification

Sentence simplification aims to improve readability and understandabilit...
research
09/28/2018

Using Multi-task and Transfer Learning to Solve Working Memory Tasks

We propose a new architecture called Memory-Augmented Encoder-Solver (MA...
research
03/13/2017

DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks

In this work, we present a compact, modular framework for constructing n...
