Empirical Evaluation of Pretraining Strategies for Supervised Entity Linking

05/28/2020
by Thibault Févry, et al.

In this work, we present an entity linking model which combines a Transformer architecture with large-scale pretraining from Wikipedia links. Our model achieves the state of the art on two commonly used entity linking datasets: 96.7% on CoNLL and 94.9% on TAC-KBP. We present detailed analyses to understand what design choices are important for entity linking, including choices of negative entity candidates, Transformer architecture, and input perturbations. Lastly, we present promising results on more challenging settings such as end-to-end entity linking and entity linking without in-domain training data.
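To make the recipe in the abstract concrete, below is a minimal sketch of Transformer-based entity linking trained against negative candidates. It is not the authors' implementation: it assumes PyTorch, and every name in it (MentionEncoder, link_loss, the toy dimensions, and the choice of treating all non-gold entities as negatives) is a hypothetical illustration of the kind of design choice the paper evaluates.

import torch
import torch.nn as nn

class MentionEncoder(nn.Module):
    """Encodes a mention in context into a fixed-size vector (illustrative)."""
    def __init__(self, vocab_size=30522, dim=256, heads=4, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, token_ids):
        h = self.encoder(self.embed(token_ids))  # (batch, seq, dim)
        return h[:, 0]                           # first token as mention vector

def link_loss(mention_vecs, entity_embeddings, gold_ids):
    """Cross-entropy of the gold entity against negative candidates.

    Here every other entity in the table acts as a negative candidate;
    the paper compares several such negative-sampling schemes.
    """
    scores = mention_vecs @ entity_embeddings.t()  # (batch, num_entities)
    return nn.functional.cross_entropy(scores, gold_ids)

# Toy usage: 8 mentions of length 32 scored against 1,000 entity embeddings.
encoder = MentionEncoder()
entity_table = nn.Embedding(1000, 256)
tokens = torch.randint(0, 30522, (8, 32))
gold = torch.randint(0, 1000, (8,))
loss = link_loss(encoder(tokens), entity_table.weight, gold)
loss.backward()

In this sketch the whole entity table serves as the candidate set, so every non-gold entity is a negative; how such candidates are chosen is one of the design choices the paper studies empirically.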

Related research

11/10/2019
YELM: End-to-End Contextualized Entity Linking
We propose yet another entity linking model (YELM) which links words to ...

01/25/2021
CHOLAN: A Modular Approach for Neural Entity Linking on Wikipedia and Wikidata
In this paper, we propose CHOLAN, a modular approach to target end-to-en...

04/27/2018
Improving Entity Linking by Modeling Latent Relations between Mentions
Entity linking involves aligning textual mentions of named entities to t...

08/31/2020
PNEL: Pointer Network based End-To-End Entity Linking over Knowledge Graphs
Question Answering systems are generally modelled as a pipeline consisti...

06/02/2020
REL: An Entity Linker Standing on the Shoulders of Giants
Entity linking is a standard component in modern retrieval system that i...

05/31/2021
A Multilingual Entity Linking System for Wikipedia with a Machine-in-the-Loop Approach
Hyperlinks constitute the backbone of the Web; they enable user navigati...

02/05/2019
The Referential Reader: A Recurrent Entity Network for Anaphora Resolution
We present a new architecture for storing and accessing entity mentions ...
