
A Comprehensive Comparison of Word Embeddings in Event & Entity Coreference Resolution

10/11/2021
by Judicael Poumay, et al.
University of Liège

Coreference Resolution is an important NLP task, and most state-of-the-art methods rely on word embeddings for word representation. However, one issue that has been largely overlooked in the literature is that of comparing the performance of different embeddings, across and within families, in this task. Therefore, we frame our study in the context of Event and Entity Coreference Resolution (EvCR & EnCR) and address two questions: 1) Is there a trade-off between performance (predictive & run-time) and embedding size? 2) How do the embeddings' performances compare within and across families? Our experiments reveal several interesting findings. First, we observe diminishing returns in performance with respect to embedding size. For example, a model using solely a character embedding achieves 86% of the performance of the largest model (Elmo, GloVe, Character) while being 1.2% of its size. Second, a model combining multiple embeddings learns faster overall despite being slower per epoch; however, it is still slower at test time. Finally, Elmo performs best on both EvCR and EnCR, while GloVe and FastText perform best in EvCR and EnCR, respectively.
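To make the size/performance trade-off concrete, the sketch below (our own illustration, not the authors' code) shows one common way embeddings from different families can be combined into a single token representation: a word-level lookup table standing in for pretrained GloVe/FastText vectors, concatenated with a small character-level BiLSTM encoder. All class names, dimensions, and the choice of a BiLSTM are assumptions made for illustration.

```python
# Hypothetical sketch (not the paper's implementation): concatenating a
# word-level embedding with a character-level embedding, so components can be
# swapped in and out to probe the size/performance trade-off discussed above.
import torch
import torch.nn as nn

class ConcatEmbedder(nn.Module):
    """Concatenates a word-level embedding with a character-level BiLSTM embedding."""
    def __init__(self, vocab_size=10000, word_dim=300, n_chars=100, char_dim=16, char_hidden=50):
        super().__init__()
        # Word-level lookup table (stands in for pretrained GloVe/FastText vectors).
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # Character embeddings fed through a BiLSTM; dimensions are illustrative.
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.char_rnn = nn.LSTM(char_dim, char_hidden, batch_first=True, bidirectional=True)
        self.out_dim = word_dim + 2 * char_hidden

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq_len); char_ids: (batch, seq_len, max_word_len)
        w = self.word_emb(word_ids)
        b, s, c = char_ids.shape
        ch = self.char_emb(char_ids.reshape(b * s, c))
        _, (h, _) = self.char_rnn(ch)             # h: (2, batch*seq_len, char_hidden)
        ch = h.transpose(0, 1).reshape(b, s, -1)  # concatenate forward/backward states
        return torch.cat([w, ch], dim=-1)         # (batch, seq_len, out_dim)

# Example: token representations for a batch of 2 sequences, 5 tokens, 12 chars per token.
embedder = ConcatEmbedder()
tokens = torch.randint(0, 10000, (2, 5))
chars = torch.randint(0, 100, (2, 5, 12))
print(embedder(tokens, chars).shape)  # torch.Size([2, 5, 400])
```

Dropping the word-level table and keeping only the character encoder yields a far smaller model, which is the kind of reduced configuration the abstract's 86%-performance-at-1.2%-size comparison refers to.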

