A Unified Encoder-Decoder Framework with Entity Memory

10/07/2022
by Zhihan Zhang et al.

Entities, as important carriers of real-world knowledge, play a key role in many NLP tasks. We focus on incorporating entity knowledge into an encoder-decoder framework for informative text generation. Existing approaches index, retrieve, and read external documents as evidence, but they suffer from a large computational overhead. In this work, we propose an encoder-decoder framework with an entity memory, namely EDMem. Entity knowledge is stored in the memory as latent representations, and the memory is pre-trained on Wikipedia along with the encoder-decoder parameters. To generate entity names precisely, we design three decoding methods that constrain entity generation by linking to entities in the memory. EDMem is a unified framework that can be applied to various entity-intensive question answering and generation tasks. Extensive experimental results show that EDMem outperforms both memory-based auto-encoder models and non-memory encoder-decoder models.
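
The abstract describes the mechanism only at a high level. As a rough illustration of the general idea, the sketch below shows one way an entity memory could be wired into an encoder-decoder model: a table of latent entity embeddings that hidden states attend over, with the attention distribution doubling as an entity-linking score that can be used to constrain decoding. The module name EntityMemory, its dimensions, and the fusion scheme are illustrative assumptions, not the authors' EDMem implementation.

```python
# Minimal sketch (assumed PyTorch-style module, not the authors' code):
# an entity memory is a table of latent entity embeddings; hidden states
# query it with attention, the attention weights act as an entity-linking
# distribution, and the retrieved embeddings are fused back into the states.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EntityMemory(nn.Module):
    def __init__(self, num_entities: int, entity_dim: int, hidden_dim: int):
        super().__init__()
        # Latent entity representations (in EDMem these would be pre-trained
        # on Wikipedia together with the encoder-decoder parameters).
        self.memory = nn.Embedding(num_entities, entity_dim)
        self.query_proj = nn.Linear(hidden_dim, entity_dim)  # hidden -> query
        self.value_proj = nn.Linear(entity_dim, hidden_dim)  # memory -> hidden

    def forward(self, hidden_states: torch.Tensor):
        """hidden_states: (batch, seq_len, hidden_dim)"""
        queries = self.query_proj(hidden_states)          # (B, T, E)
        scores = queries @ self.memory.weight.T           # (B, T, num_entities)
        link_probs = F.softmax(scores, dim=-1)            # entity-linking distribution
        retrieved = link_probs @ self.memory.weight       # weighted entity embeddings
        fused = hidden_states + self.value_proj(retrieved)  # inject knowledge
        return fused, link_probs


# Toy usage: fuse entity knowledge into hidden states and read off the
# top-scoring entity at the last position.
memory = EntityMemory(num_entities=1000, entity_dim=256, hidden_dim=512)
hidden = torch.randn(2, 8, 512)
fused, link_probs = memory(hidden)
top_entity = link_probs[:, -1].argmax(dim=-1)
print(fused.shape, top_entity.shape)  # torch.Size([2, 8, 512]) torch.Size([2])
```

In EDMem, such linking scores would feed the paper's three decoding methods, which restrict generation to the names of the linked entities; the toy usage above merely picks the top-scoring entity at the final position.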


Related research

10/02/2020
Autoregressive Entity Retrieval
Entities are at the center of how we represent and aggregate knowledge. ...

05/02/2023
UNTER: A Unified Knowledge Interface for Enhancing Pre-trained Language Models
Recent research demonstrates that external knowledge injection can advan...

07/26/2018
Variational Memory Encoder-Decoder
Introducing variability while maintaining coherence is a core task in le...

04/26/2023
Understand the Dynamic World: An End-to-End Knowledge Informed Framework for Open Domain Entity State Tracking
Open domain entity state tracking aims to predict reasonable state chang...

09/28/2020
Injecting Entity Types into Entity-Guided Text Generation
Recent successes in deep generative modeling have led to significant adv...

06/12/2021
Evaluating Entity Disambiguation and the Role of Popularity in Retrieval-Based NLP
Retrieval is a core component for open-domain NLP tasks. In open-domain ...

06/01/2023
Hierarchical Attention Encoder Decoder
Recent advances in large language models have shown that autoregressive ...
