Injecting Entity Types into Entity-Guided Text Generation

09/28/2020
by   Xiangyu Dong, et al.

Recent successes in deep generative modeling have led to significant advances in natural language generation (NLG). Incorporating entities into neural generation models has yielded substantial improvements by helping to infer the summary topic and to generate coherent content. To strengthen the role of entities in NLG, in this paper we model the entity type during the decoding phase so that contextual words are generated accurately. We develop a novel NLG model that produces a target sequence (i.e., a news article) from a given list of entities. Generation quality depends heavily on whether the input entities are logically connected and expressed in the output. Our model uses a multi-step decoder that injects entity types into the process of entity-mention generation: it first predicts whether the next token is a contextual word or an entity and, if an entity, then predicts the entity mention. This effectively embeds the entity's meaning into the hidden states, making the generated words precise. Experiments on two public datasets show that type injection outperforms type-embedding concatenation baselines.
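The multi-step decoding idea can be illustrated with a toy sketch in plain Python. Everything here is hypothetical: the class name, the random toy weights, and the simple additive "injection" of the type embedding into the hidden state are illustrative stand-ins, not the paper's trained architecture. The sketch only shows the control flow the abstract describes: a gate first decides contextual word vs. entity; for entities, the entity-type embedding is injected before the mention is predicted.

```python
import math
import random

random.seed(0)


def softmax(logits):
    """Numerically stable softmax over a list of scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]


class TypeInjectionDecoder:
    """Toy sketch of one multi-step decoding step (all names/weights hypothetical).

    Step 1: a binary gate decides whether the next token is a contextual
            word or an entity mention.
    Step 2a: if contextual, score the word vocabulary.
    Step 2b: if an entity, inject the entity-type embedding into the
             hidden state (here: simple element-wise addition, a loud
             simplification of the paper's learned injection) and score
             candidate entity mentions.
    """

    def __init__(self, hidden_size, vocab, entities, type_embeddings):
        self.vocab = vocab                       # list of contextual words
        self.entities = entities                 # list of (mention, type) pairs
        self.type_embeddings = type_embeddings   # type name -> vector
        # Random toy weights stand in for trained parameters.
        self.w_gate = [random.uniform(-1, 1) for _ in range(hidden_size)]
        self.w_vocab = [[random.uniform(-1, 1) for _ in range(hidden_size)]
                        for _ in vocab]
        self.w_entity = [[random.uniform(-1, 1) for _ in range(hidden_size)]
                         for _ in entities]

    def step(self, hidden):
        # Step 1: sigmoid gate -- probability the next token is an entity.
        gate = 1.0 / (1.0 + math.exp(-sum(w * h for w, h in zip(self.w_gate, hidden))))
        if gate < 0.5:
            # Step 2a: ordinary contextual word prediction.
            probs = softmax([sum(w * h for w, h in zip(row, hidden))
                             for row in self.w_vocab])
            return "word", self.vocab[probs.index(max(probs))]
        # Step 2b: inject the type embedding, then score entity mentions.
        scores = []
        for row, (mention, etype) in zip(self.w_entity, self.entities):
            injected = [h + t for h, t in zip(hidden, self.type_embeddings[etype])]
            scores.append(sum(w * x for w, x in zip(row, injected)))
        probs = softmax(scores)
        return "entity", self.entities[probs.index(max(probs))][0]


# Usage: one decoding step over a tiny toy vocabulary and entity list.
decoder = TypeInjectionDecoder(
    hidden_size=4,
    vocab=["the", "won"],
    entities=[("Liverpool", "ORG")],
    type_embeddings={"ORG": [0.1, 0.2, 0.0, -0.1]},
)
kind, token = decoder.step([0.5, 0.2, -0.1, 0.3])
```

The design point the sketch captures is that the type embedding enters the computation *before* the mention is scored, so the mention distribution is conditioned on the type rather than the type merely being appended as an extra feature.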


Related research

08/08/2022  Generating Coherent Narratives by Learning Dynamic and Discrete Entity States with a Contrastive Framework
Despite advances in generating fluent texts, existing pretraining models...

06/07/2019  Data-to-text Generation with Entity Modeling
Recent approaches to data-to-text generation have shown great promise th...

11/02/2022  Generative Entity-to-Entity Stance Detection with Knowledge Graph Augmentation
Stance detection is typically framed as predicting the sentiment in a gi...

10/07/2022  A Unified Encoder-Decoder Framework with Entity Memory
Entities, as important carriers of real-world knowledge, play a key role...

04/15/2021  Planning with Entity Chains for Abstractive Summarization
Pre-trained transformer-based sequence-to-sequence models have become th...

04/05/2019  PoMo: Generating Entity-Specific Post-Modifiers in Context
We introduce entity post-modifier generation as an instance of a collabo...

02/03/2021  A Computational Framework for Slang Generation
Slang is a common type of informal language, but its flexible nature and...
