Contextualized Word Embeddings Encode Aspects of Human-Like Word Sense Knowledge

10/25/2020
by Sathvik Nair, et al.

Understanding context-dependent variation in word meanings is a key aspect of human language comprehension supported by the lexicon. Lexicographic resources (e.g., WordNet) capture only some of this context-dependent variation; for example, they often do not encode how closely senses, or discretized word meanings, are related to one another. Our work investigates whether recent advances in NLP, specifically contextualized word embeddings, capture human-like distinctions between English word senses, such as polysemy and homonymy. We collect data from a behavioral, web-based experiment, in which participants provide judgments of the relatedness of multiple WordNet senses of a word in a two-dimensional spatial arrangement task. We find that participants' judgments of the relatedness between senses are correlated with distances between senses in the BERT embedding space. Homonymous senses (e.g., bat as mammal vs. bat as sports equipment) are reliably more distant from one another in the embedding space than polysemous ones (e.g., chicken as animal vs. chicken as meat). Our findings point towards the potential utility of continuous-space representations of sense meanings.
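
The following is a minimal illustrative sketch, not the authors' released code, of the core measurement the abstract describes: extracting BERT's contextualized vectors for the same word form in two different contexts and comparing them with cosine similarity. It assumes the Hugging Face transformers library and the bert-base-uncased checkpoint; the example sentences, the use of the final hidden layer, and the first-subtoken heuristic for locating the target word are all assumptions made for illustration.

```python
import torch
from torch.nn.functional import cosine_similarity
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Final-layer BERT vector for the first subtoken of `word` in `sentence`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]   # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    idx = tokens.index(tokenizer.tokenize(word)[0])  # first subtoken of target word
    return hidden[idx]

# Homonymy: bat (mammal) vs. bat (sports equipment).
bat_animal = word_vector("The bat flew out of the cave at dusk.", "bat")
bat_sports = word_vector("He swung the bat and hit a home run.", "bat")

# Polysemy: chicken (animal) vs. chicken (meat).
chicken_animal = word_vector("The chicken pecked at the grain.", "chicken")
chicken_meat = word_vector("She grilled some chicken for dinner.", "chicken")

# The paper's finding predicts lower similarity (greater embedding distance)
# for the homonymous pair than for the polysemous pair.
print("bat vs. bat:", cosine_similarity(bat_animal, bat_sports, dim=0).item())
print("chicken vs. chicken:", cosine_similarity(chicken_animal, chicken_meat, dim=0).item())
```

The paper's result leads us to expect the homonymous pair (bat) to come out less similar than the polysemous pair (chicken); of course, the study itself compares distances between WordNet senses against human relatedness judgments rather than relying on single hand-picked sentence pairs like these.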

Related research

09/21/2022: Representing Affect Information in Word Embeddings
A growing body of research in natural language processing (NLP) and natu...

06/09/2019: Probing for Semantic Classes: Diagnosing the Meaning Content of Word Embeddings
Word embeddings typically represent different meanings of a word in a si...

12/14/2020: Model Choices Influence Attributive Word Associations: A Semi-supervised Analysis of Static Word Embeddings
Static word embeddings encode word associations, extensively utilized in...

09/24/2020: CogniFNN: A Fuzzy Neural Network Framework for Cognitive Word Embedding Evaluation
Word embeddings can reflect the semantic representations, and the embedd...

11/09/2020: Catch the "Tails" of BERT
Recently, contextualized word embeddings outperform static word embeddin...

08/11/2022: Word-Embeddings Distinguish Denominal and Root-Derived Verbs in Semitic
Proponents of the Distributed Morphology framework have posited the exis...

09/02/2019: Rotate King to get Queen: Word Relationships as Orthogonal Transformations in Embedding Space
A notable property of word embeddings is that word relationships can exi...
