Dis-S2V: Discourse Informed Sen2Vec

10/25/2016
by Tanay Kumar Saha, et al.

Vector representation of sentences is important for many text processing tasks that involve clustering, classifying, or ranking sentences. Recently, distributed representations of sentences learned by neural models from unlabeled data have been shown to outperform the traditional bag-of-words representation. However, most of these learning methods consider only the content of a sentence and largely disregard the relations among sentences in a discourse. In this paper, we propose a series of novel models for learning latent representations of sentences (Sen2Vec) that consider the content of a sentence as well as inter-sentence relations. We first represent the inter-sentence relations with a language network and then use the network to induce contextual information into the content-based Sen2Vec models. Two different approaches are introduced to exploit the information in the network. Our first approach retrofits (already trained) Sen2Vec vectors with respect to the network in two different ways: (1) using the adjacency relations of a node, and (2) using a stochastic sampling method, which is more flexible in how it samples the neighbors of a node. The second approach uses a regularizer to encode the information in the network into the existing Sen2Vec model. Experimental results show that our proposed models outperform existing methods on three fundamental information system tasks, demonstrating the effectiveness of our approach. The models exploit multi-core CPUs for computational efficiency. We make our code publicly available upon acceptance.
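To make the first (retrofitting) approach concrete, the following is a minimal sketch of adjacency-based retrofitting of pre-trained sentence vectors over a language network. It assumes a Faruqui-style iterative update; the function name retrofit, the alpha/beta weights, and the iteration count are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of adjacency-based retrofitting of Sen2Vec vectors.
# Each sentence vector is pulled toward its neighbors in the discourse
# (language) network while staying close to its original content-based vector.
import numpy as np

def retrofit(sen2vec, graph, alpha=1.0, beta=1.0, iters=10):
    """sen2vec : dict mapping sentence id -> np.ndarray (pre-trained vectors)
    graph   : dict mapping sentence id -> list of neighboring sentence ids
    alpha   : weight on staying close to the original vector (assumed)
    beta    : weight on agreeing with graph neighbors (assumed)
    """
    new_vecs = {s: v.copy() for s, v in sen2vec.items()}
    for _ in range(iters):
        for s, neighbors in graph.items():
            if not neighbors:
                continue
            # Closed-form coordinate update: weighted average of the original
            # vector and the neighbors' current vectors.
            neighbor_sum = sum(new_vecs[n] for n in neighbors)
            new_vecs[s] = (alpha * sen2vec[s] + beta * neighbor_sum) / (
                alpha + beta * len(neighbors)
            )
    return new_vecs
```

The second approach described in the abstract would instead add a graph-based regularizer to the Sen2Vec training objective itself, rather than post-processing already trained vectors as above.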

