YELM: End-to-End Contextualized Entity Linking

11/10/2019
by Haotian Chen, et al.

We propose yet another entity linking model (YELM), which links words to entities instead of spans. This sidesteps the difficulty of selecting good candidate mention spans and makes joint training of mention detection (MD) and entity disambiguation (ED) straightforward. Our model is based on BERT and produces contextualized word embeddings that are trained against a joint MD and ED objective. We achieve state-of-the-art results on several standard entity linking (EL) datasets.
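The word-level formulation described above can be pictured as a per-word classification over an entity vocabulary that includes a "no mention" (NIL) label, so MD and ED reduce to a single decision per word. The sketch below is a toy illustration under that assumption: random vectors stand in for BERT's contextualized word embeddings, and all names (`link_words`, `entity_emb`, the NIL index) are hypothetical, not from the paper.

```python
import math
import random

random.seed(0)
DIM, N_WORDS, N_ENTITIES = 8, 5, 3   # toy sizes; a real model uses BERT dims

# Stand-ins for BERT contextualized word embeddings (one vector per word).
word_emb = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_WORDS)]

# Entity embeddings; row 0 plays the role of a NIL entity, so mention
# detection (is this word inside a mention?) and disambiguation (which
# entity?) collapse into one per-word classification -- the joint MD+ED idea.
entity_emb = [[random.gauss(0, 1) for _ in range(DIM)]
              for _ in range(N_ENTITIES + 1)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def link_words(word_emb, entity_emb):
    """Score every word against every entity and take the softmax argmax."""
    predictions, distributions = [], []
    for w in word_emb:
        scores = [dot(w, e) for e in entity_emb]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        probs = [x / z for x in exps]
        predictions.append(probs.index(max(probs)))
        distributions.append(probs)
    return predictions, distributions

labels, probs = link_words(word_emb, entity_emb)
# labels[i] is the predicted entity id for word i (0 = no mention);
# probs[i] is the softmax distribution over the entity vocabulary.
```

In training, each word's softmax distribution would be matched against a gold label (an entity id or NIL) with cross-entropy, which is one way a single joint MD and ED objective can be realized.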


Related research

01/06/2020 · Improving Entity Linking by Modeling Latent Entity Type Information
Existing state of the art neural entity linking models employ attention-...

06/16/2021 · Improving Entity Linking through Semantic Reinforced Entity Embeddings
Entity embeddings, which represent different aspects of each entity with...

05/28/2020 · Empirical Evaluation of Pretraining Strategies for Supervised Entity Linking
In this work, we present an entity linking model which combines a Transf...

03/11/2020 · Investigating Entity Knowledge in BERT with Simple Neural End-To-End Entity Linking
A typical architecture for end-to-end entity linking systems consists of...

08/23/2018 · End-to-End Neural Entity Linking
Entity Linking (EL) is an essential task for semantic text understanding...

11/11/2016 · Neural Networks Models for Entity Discovery and Linking
This paper describes the USTC_NELSLIP systems submitted to the Trilingua...

05/24/2023 · A Fair and In-Depth Evaluation of Existing End-to-End Entity Linking Systems
Existing evaluations of entity linking systems often say little about ho...
