Linguistic Versus Latent Relations for Modeling Coherent Flow in Paragraphs

08/30/2019
by Dongyeop Kang, et al.

Generating a long, coherent text such as a paragraph requires high-level control over different kinds of relations between sentences (e.g., tense, coreference). We call such a logical connection between sentences a (paragraph) flow. To produce a coherent flow of text, we explore two forms of inter-sentential relations in a paragraph: one is a human-created linguistic relation that forms a structure (e.g., a discourse tree), and the other is a latent relation learned from the sentences themselves. Our two proposed models incorporate each form of relation into a document-level language model: the former is a supervised model that jointly learns a language model and discourse relation prediction, and the latter is an unsupervised model in which a recurrent neural network (RNN) is hierarchically conditioned on the latent information. Our proposed models with both forms of relations outperform the baselines on a partially conditioned paragraph generation task. Our code and data are publicly available.
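The unsupervised variant described above, an RNN hierarchically conditioned on latent inter-sentence information, can be sketched roughly as follows. This is a minimal illustration in PyTorch, not the authors' implementation: all class names, layer sizes, and the choice of GRU units are assumptions. The key idea shown is that a sentence-level RNN carries a latent "flow" state across sentences, which initializes the word-level RNN for each new sentence.

```python
import torch
import torch.nn as nn

class HierarchicalFlowLM(nn.Module):
    """Illustrative document-level LM: a word-level GRU per sentence,
    conditioned on a sentence-level GRU that tracks latent paragraph
    flow. Names and dimensions are hypothetical."""

    def __init__(self, vocab_size, emb_dim=32, word_hid=64, sent_hid=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Word-level RNN reads tokens within one sentence.
        self.word_rnn = nn.GRU(emb_dim, word_hid, batch_first=True)
        # Sentence-level RNN carries the latent inter-sentence state.
        self.sent_rnn = nn.GRUCell(word_hid, sent_hid)
        # Project the latent flow state to initialize the word RNN.
        self.to_word_init = nn.Linear(sent_hid, word_hid)
        self.out = nn.Linear(word_hid, vocab_size)

    def forward(self, sentences):
        # sentences: list of LongTensors, each of shape (batch, seq_len)
        batch = sentences[0].size(0)
        sent_state = torch.zeros(batch, self.sent_rnn.hidden_size)
        logits_per_sentence = []
        for sent in sentences:
            # Condition the word-level RNN on the current flow state.
            h0 = torch.tanh(self.to_word_init(sent_state)).unsqueeze(0)
            out, hn = self.word_rnn(self.embed(sent), h0)
            logits_per_sentence.append(self.out(out))
            # Update the latent flow state with the sentence summary.
            sent_state = self.sent_rnn(hn.squeeze(0), sent_state)
        return logits_per_sentence
```

The supervised variant would add a classification head over the sentence-level states to predict discourse relations jointly with the language-modeling objective.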


