Novel Aficionados and Doppelgängers: a referential task for semantic representations of individual entities

04/20/2021
by   Andrea Bruera, et al.

In human semantic cognition, proper names (names that refer to individual entities) are harder to learn and retrieve than common nouns. This also appears to hold for machine learning algorithms, but the linguistic and distributional reasons for this behaviour have not yet been investigated in depth. To tackle this issue, we employ an original task for distributional semantics, the Doppelgänger test, together with an extensive set of models and a new dataset, the Novel Aficionados dataset, to show that the semantic distinction between proper names and common nouns is reflected in their linguistic distributions. The results indicate that the distributional representations of different individual entities are less clearly distinguishable from each other than those of common nouns, an outcome which intriguingly mirrors human cognition.
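As a rough illustration of the kind of finding reported here (not the paper's actual Doppelgänger procedure, and with purely hypothetical, made-up vectors standing in for real corpus-derived embeddings), one can compare how distinguishable the members of each category are from one another via average pairwise cosine similarity:

```python
# Minimal sketch: given distributional vectors for a set of proper names and a
# set of common nouns, measure how distinguishable the members of each set are
# from one another. Higher average pairwise cosine similarity means the
# representations are LESS clearly distinguishable from each other.
# All vectors below are hypothetical stand-ins, not data from the paper.
import numpy as np

rng = np.random.default_rng(0)

def avg_pairwise_cosine(vectors):
    """Mean cosine similarity over all unordered pairs of row vectors."""
    normed = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    sims = normed @ normed.T
    upper = np.triu_indices(len(vectors), k=1)
    return sims[upper].mean()

# Hypothetical embeddings: proper names clustered around a shared "person"
# core, common nouns spread more widely across the semantic space.
person_core = rng.normal(size=300)
proper_names = np.stack([person_core + 0.3 * rng.normal(size=300) for _ in range(10)])
common_nouns = rng.normal(size=(10, 300))

print("proper names:", avg_pairwise_cosine(proper_names))  # high similarity
print("common nouns:", avg_pairwise_cosine(common_nouns))  # lower similarity
```

Under this toy setup, the proper-name vectors come out far more similar to one another than the common-noun vectors, mirroring the qualitative pattern the abstract describes.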

