Dynamic Entity Representations in Neural Language Models

08/02/2017
by Yangfeng Ji, et al.

Understanding a long document requires tracking how entities are introduced and evolve over time. We present a new type of language model, EntityNLM, that can explicitly model entities, dynamically update their representations, and contextually generate their mentions. Our model is generative and flexible: it can model an arbitrary number of entities in context and generate entity mentions of arbitrary length. In addition, it can be applied to several tasks, such as language modeling, coreference resolution, and entity prediction. Experimental results on all these tasks demonstrate that our model consistently outperforms strong baselines and prior work.
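
To make the dynamic-update idea concrete, below is a minimal PyTorch sketch of one plausible form of the mechanism the abstract describes: each tracked entity keeps a vector, and every time that entity is mentioned, its vector is blended with the current hidden state through a learned sigmoid gate and renormalized. The class name DynamicEntityStore, the gate matrix W_delta, and the initialization are illustrative assumptions, not the paper's released code.

```python
import torch
import torch.nn.functional as F

class DynamicEntityStore:
    """Illustrative sketch (not the authors' implementation) of dynamic
    entity representations: one unit-norm vector per entity, updated
    whenever that entity is mentioned."""

    def __init__(self, dim: int):
        self.dim = dim
        self.vectors = []  # one unit-norm vector per tracked entity

    def add_entity(self) -> int:
        # A newly introduced entity starts from a random unit-norm
        # vector; returns the new entity's index.
        e = F.normalize(torch.randn(self.dim), dim=0)
        self.vectors.append(e)
        return len(self.vectors) - 1

    def update(self, idx: int, h: torch.Tensor, W_delta: torch.Tensor):
        # Gated interpolation: a scalar gate decides how much of the
        # current hidden state h flows into the entity vector, and the
        # result is renormalized so entity vectors stay comparable.
        e = self.vectors[idx]
        delta = torch.sigmoid(h @ W_delta @ e)  # scalar gate in (0, 1)
        self.vectors[idx] = F.normalize((1 - delta) * e + delta * h, dim=0)
        return self.vectors[idx]

# Example usage (dimensions and values are arbitrary; W_delta would be
# a learned parameter in the full model):
store = DynamicEntityStore(dim=64)
W_delta = torch.randn(64, 64) * 0.1
i = store.add_entity()
h = torch.randn(64)  # RNN hidden state at a mention of entity i
store.update(i, h, W_delta)
```

In the full model, these entity vectors would also condition the next-token distribution and the choice of which entity to mention next; the sketch isolates only the representation-update step.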


Related research:

- 08/31/2019: EntEval: A Holistic Evaluation Benchmark for Entity Representations
- 08/30/2022: Efficient and Interpretable Neural Models for Entity Tracking
- 08/24/2018: A Trio Neural Model for Dynamic Entity Relatedness Ranking
- 12/20/2022: Language Modeling with Latent Situations
- 11/14/2017: Simulating Action Dynamics with Neural Process Networks
- 12/21/2022: Resolving Indirect Referring Expressions for Entity Selection
- 02/05/2019: The Referential Reader: A Recurrent Entity Network for Anaphora Resolution
