Learning Cross-Context Entity Representations from Text

01/11/2020
by Jeffrey Ling, et al.

Language modeling tasks, in which words or word-pieces are predicted on the basis of a local context, have been very effective for learning word embeddings and context-dependent representations of phrases. Motivated by the observation that efforts to code world knowledge into machine-readable knowledge bases or human-readable encyclopedias tend to be entity-centric, we investigate the use of a fill-in-the-blank task to learn context-independent representations of entities from the text contexts in which those entities were mentioned. We show that large-scale training of neural models allows us to learn high-quality entity representations, and we demonstrate successful results on four domains: (1) existing entity-level typing benchmarks, including a 64% error reduction over previous work on TypeNet (Murty et al., 2018); (2) a novel few-shot category reconstruction task; (3) existing entity linking benchmarks, where we match the state-of-the-art on CoNLL-Aida without linking-specific features and obtain a score of 89.8% on TAC-KBP 2010 without using a structured knowledge base or in-domain training data; and (4) answering trivia questions, which uniquely identify entities. Our global entity representations encode fine-grained type categories, such as Scottish footballers, and can answer trivia questions such as: Who was the last inmate of Spandau jail in Berlin?
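As a rough illustration of the fill-in-the-blank objective described above, the following PyTorch sketch encodes a mention-masked context and scores the resulting vector against a table of context-independent entity embeddings. The tiny Transformer encoder, the class and argument names (EntityFillInTheBlank, mask_pos), and all sizes are illustrative assumptions, not the authors' implementation; the paper's own context encoders are BERT-based and trained at much larger scale.

# Minimal sketch of the fill-in-the-blank entity objective: a context encoder
# produces a vector for a mention-masked context, and that vector is scored
# against a table of context-independent entity embeddings.
import torch
import torch.nn as nn


class EntityFillInTheBlank(nn.Module):
    def __init__(self, vocab_size: int, num_entities: int, dim: int = 128):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=4, batch_first=True
        )
        self.context_encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # Context-independent entity representations: one vector per entity.
        self.entity_emb = nn.Embedding(num_entities, dim)

    def forward(self, token_ids: torch.Tensor, mask_pos: torch.Tensor):
        # token_ids: [batch, seq_len] word-piece ids with the mention replaced
        # by a placeholder token; mask_pos: [batch] index of that slot.
        h = self.context_encoder(self.token_emb(token_ids))   # [B, L, D]
        ctx = h[torch.arange(h.size(0)), mask_pos]            # [B, D]
        # Dot-product score between the context vector and every entity.
        return ctx @ self.entity_emb.weight.T                 # [B, num_entities]


# Toy usage: predict which entity fills the blank in each context.
model = EntityFillInTheBlank(vocab_size=1000, num_entities=500)
tokens = torch.randint(0, 1000, (2, 16))
mask_pos = torch.tensor([3, 7])
gold_entities = torch.tensor([42, 7])
loss = nn.functional.cross_entropy(model(tokens, mask_pos), gold_entities)
loss.backward()

Training this objective over many mention contexts pushes each entity's single embedding toward everything said about that entity, which is what makes the resulting representations useful for typing, linking, and trivia-style retrieval.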
