Mention Memory: incorporating textual knowledge into Transformers through entity mention attention

10/12/2021
by Michiel de Jong, et al.

Natural language understanding tasks such as open-domain question answering often require retrieving and assimilating factual information from multiple sources. We propose to address this problem by integrating a semi-parametric representation of a large text corpus into a Transformer model as a source of factual knowledge. Specifically, our method represents knowledge with "mention memory", a table of dense vector representations of every entity mention in a corpus. The proposed model, TOME, is a Transformer that accesses this information through internal memory layers in which each entity mention in the input passage attends to the mention memory. This approach enables synthesis of, and reasoning over, many disparate sources of information within a single Transformer model. In experiments using a memory of 150 million Wikipedia mentions, TOME achieves strong performance on several open-domain knowledge-intensive tasks, including the claim-verification benchmarks HoVer and FEVER and several entity-based QA benchmarks. We also show that the model learns to attend to informative mentions without any direct supervision. Finally, we demonstrate that the model can generalize to new, unseen entities by updating the memory without retraining.
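To make the memory-layer mechanism concrete, below is a minimal sketch of mention-to-memory attention in PyTorch. All class and variable names are illustrative, not taken from the paper's code, and the sketch performs exact top-k dot-product attention over a small in-memory table; at the scale of 150 million entries, retrieval would instead rely on approximate nearest-neighbor search.

```python
# Minimal sketch of a TOME-style mention-memory attention layer.
# Names and shapes are illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn

class MentionMemoryAttention(nn.Module):
    def __init__(self, hidden_dim: int, memory_dim: int, top_k: int = 32):
        super().__init__()
        self.query_proj = nn.Linear(hidden_dim, memory_dim)  # mention span -> memory query
        self.value_proj = nn.Linear(memory_dim, hidden_dim)  # retrieved memory -> hidden space
        self.top_k = top_k

    def forward(self, mention_reprs: torch.Tensor, memory: torch.Tensor) -> torch.Tensor:
        # mention_reprs: (num_mentions, hidden_dim) -- one vector per entity mention in the passage
        # memory:        (num_entries, memory_dim)  -- precomputed mention-memory table
        queries = self.query_proj(mention_reprs)             # (num_mentions, memory_dim)
        scores = queries @ memory.T                          # (num_mentions, num_entries)
        # Attend only to the top-k highest-scoring entries per mention; this stands in
        # for the approximate search needed at the paper's 150M-entry scale.
        top_scores, top_idx = scores.topk(self.top_k, dim=-1)
        weights = torch.softmax(top_scores, dim=-1)          # (num_mentions, top_k)
        retrieved = memory[top_idx]                          # (num_mentions, top_k, memory_dim)
        pooled = (weights.unsqueeze(-1) * retrieved).sum(dim=1)
        return mention_reprs + self.value_proj(pooled)       # residual update of mention states

# Toy usage with random stand-ins for a real mention table and passage encodings:
memory = torch.randn(10_000, 128)
layer = MentionMemoryAttention(hidden_dim=768, memory_dim=128)
mentions = torch.randn(5, 768)
updated = layer(mentions, memory)  # (5, 768)
```

Because the memory table sits outside the Transformer's parameters and is produced by encoding the corpus once offline, appending rows for new entities changes what the model can retrieve without retraining, which is the generalization behavior the abstract describes.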


Related research

10/02/2020 · LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention
Entity representations are useful in natural language tasks involving en...

09/04/2020 · KILT: a Benchmark for Knowledge Intensive Language Tasks
Challenging problems such as open-domain question answering, fact checki...

12/30/2020 · A Memory Efficient Baseline for Open Domain Question Answering
Recently, retrieval systems based on dense representations have led to i...

11/15/2022 · Breakpoint Transformers for Modeling and Tracking Intermediate Beliefs
Can we teach natural language understanding models to track their belief...

10/11/2022 · Entity Disambiguation with Entity Definitions
Local models have recently attained astounding performances in Entity Di...

10/04/2022 · Transformer-based Subject Entity Detection in Wikipedia Listings
In tasks like question answering or text summarisation, it is essential ...

04/15/2020 · Entities as Experts: Sparse Memory Access with Entity Supervision
We focus on the problem of capturing declarative knowledge in the learne...
