Exploring the Representation of Word Meanings in Context: A Case Study on Homonymy and Synonymy

06/25/2021
by Marcos Garcia, et al.

This paper presents a multilingual study of word meaning representations in context. We assess the ability of both static and contextualized models to adequately represent different lexical-semantic relations, such as homonymy and synonymy. To do so, we created a new multilingual dataset that allows us to perform a controlled evaluation of several factors, such as the impact of the surrounding context or the overlap between words conveying the same or different senses. A systematic assessment across four scenarios shows that the best monolingual Transformer-based models can adequately disambiguate homonyms in context. However, because they rely heavily on context, these models fail to represent words with different senses when those words occur in similar sentences. Experiments are performed in Galician, Portuguese, English, and Spanish, and both the dataset (with more than 3,000 evaluation items) and new models are freely released with this study.
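The controlled evaluation described above boils down to comparing contextual embeddings of the same word form in different sentences: a model that captures homonymy should assign higher similarity to same-sense pairs than to different-sense pairs. The following is a minimal sketch of that comparison; the vectors are toy placeholders standing in for real Transformer outputs, and the example word and sentences are illustrative, not taken from the dataset:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy contextual embeddings for the homonym "bank"
# (placeholders, not actual model outputs):
bank_river_1 = np.array([0.9, 0.1, 0.0])   # "the bank of the river"
bank_river_2 = np.array([0.8, 0.2, 0.1])   # "we sat on the river bank"
bank_money   = np.array([0.1, 0.9, 0.2])   # "she deposited money at the bank"

same_sense = cosine(bank_river_1, bank_river_2)
diff_sense = cosine(bank_river_1, bank_money)

# A model that disambiguates homonyms in context should score
# same-sense pairs higher than different-sense pairs:
print(same_sense > diff_sense)
```

In practice, such vectors would be taken from a contextualized model's hidden states for the target token in each sentence; the paper's finding is that this test succeeds when the surrounding contexts differ, but breaks down when different senses occur in similar sentences.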

