Injecting Entity Types into Entity-Guided Text Generation

09/28/2020
by Xiangyu Dong, et al.

Recent successes in deep generative modeling have led to significant advances in natural language generation (NLG). Incorporating entities into neural generation models has been shown to help infer the summary topic and produce coherent content. To strengthen the role of entities in NLG, in this paper we model the entity type during decoding so that contextual words are generated accurately. We develop a novel NLG model that produces a target sequence (i.e., a news article) from a given list of entities; generation quality depends heavily on whether the input entities are logically connected and expressed in the output. Our model uses a multi-step decoder that injects entity types into the process of entity mention generation: it first predicts whether the next token is a contextual word or an entity, and if it is an entity, it then predicts the entity mention. This effectively embeds the entity's meaning into the hidden states and makes the generated words precise. Experiments on two public datasets demonstrate that type injection outperforms baselines that simply concatenate type embeddings.
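
To make the two-stage idea concrete, the sketch below shows one way such a type-injected decoding step could look in PyTorch. It is a minimal illustration under assumed names and shapes (TypeInjectedDecoderStep, a fixed candidate-mention list, a soft type embedding mixed into the hidden state before mention prediction); it is not the authors' implementation, whose injection details and mention prediction may differ.

```python
# Minimal sketch of a type-injected decoding step (assumed names, not the paper's code).
import torch
import torch.nn as nn


class TypeInjectedDecoderStep(nn.Module):
    def __init__(self, hidden_size, vocab_size, num_entity_types, num_entities):
        super().__init__()
        # Switch: is the next token a contextual word or an entity mention?
        self.switch = nn.Linear(hidden_size, 2)
        # Ordinary vocabulary head for contextual words.
        self.word_head = nn.Linear(hidden_size, vocab_size)
        # Entity-type prediction and a type embedding that gets injected
        # into the hidden state before the mention is predicted.
        self.type_head = nn.Linear(hidden_size, num_entity_types)
        self.type_embedding = nn.Embedding(num_entity_types, hidden_size)
        self.inject = nn.Linear(2 * hidden_size, hidden_size)
        self.mention_head = nn.Linear(hidden_size, num_entities)

    def forward(self, h):
        # h: (batch, hidden_size) decoder hidden state at the current step.
        switch_logits = self.switch(h)        # contextual word vs. entity
        word_logits = self.word_head(h)       # distribution over vocabulary
        type_logits = self.type_head(h)       # distribution over entity types

        # Inject a soft type embedding into the hidden state, then predict
        # the concrete entity mention from the type-aware representation.
        type_probs = torch.softmax(type_logits, dim=-1)
        type_vec = type_probs @ self.type_embedding.weight       # (batch, hidden_size)
        h_injected = torch.tanh(self.inject(torch.cat([h, type_vec], dim=-1)))
        mention_logits = self.mention_head(h_injected)

        return switch_logits, word_logits, type_logits, mention_logits


# Example usage with illustrative sizes.
step = TypeInjectedDecoderStep(hidden_size=256, vocab_size=30000,
                               num_entity_types=18, num_entities=500)
outputs = step(torch.randn(4, 256))
```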

Related Research

- Data-to-text Generation with Entity Modeling (06/07/2019)
- EntEval: A Holistic Evaluation Benchmark for Entity Representations (08/31/2019)
- AutoETER: Automated Entity Type Representation for Knowledge Graph Embedding (09/25/2020)
- Entity Commonsense Representation for Neural Abstractive Summarization (06/14/2018)
- Planning with Entity Chains for Abstractive Summarization (04/15/2021)
- PoMo: Generating Entity-Specific Post-Modifiers in Context (04/05/2019)
- A Computational Framework for Slang Generation (02/03/2021)
