Long Text Generation by Modeling Sentence-Level and Discourse-Level Coherence

05/19/2021
by Jian Guan, et al.

Generating long and coherent text is an important but challenging task, particularly for open-ended language generation tasks such as story generation. Despite their success in modeling intra-sentence coherence, existing generation models (e.g., BART) still struggle to maintain a coherent event sequence throughout the generated text. We conjecture that this is because it is difficult for the decoder to capture the high-level semantics and discourse structures of the context beyond token-level co-occurrence. In this paper, we propose a long text generation model that represents the prefix sentences at both the sentence level and the discourse level during decoding. To this end, we introduce two pretraining objectives that learn these representations by predicting inter-sentence semantic similarity and by distinguishing normal from shuffled sentence orders. Extensive experiments show that our model generates more coherent text than state-of-the-art baselines.


Related research

03/17/2022 - PLANET: Dynamic Content Planning in Autoregressive Transformers for Long-form Text Generation
10/12/2021 - DiscoDVT: Generating Long Text with Discourse-Aware Discrete Variational Transformer
10/16/2022 - Model Criticism for Long-Form Text Generation
10/14/2020 - Summarize, Outline, and Elaborate: Long-Text Generation via Hierarchical Supervision from Extractive Summaries
09/05/2019 - TransSent: Towards Generation of Structured Sentences with Discourse Marker
05/25/2022 - RSTGen: Imbuing Fine-Grained Interpretable Control into Long-Form Text Generators
06/01/2019 - Adversarial Generation and Encoding of Nested Texts
