
Multi-sense embeddings through a word sense disambiguation process

01/21/2021
by Terry Ruas, et al.

Natural Language Understanding has seen an increasing number of publications in recent years, especially after robust word embeddings models became prominent and proved able to capture and represent semantic relationships from massive amounts of data. Nevertheless, traditional models often fall short on intrinsic linguistic issues such as polysemy and homonymy. Any expert system with natural language at its core can be affected by a weak semantic representation of text, resulting in inaccurate outcomes based on poor decisions. To mitigate such issues, we propose a novel approach called Most Suitable Sense Annotation (MSSA), which disambiguates and annotates each word by its specific sense, considering the semantic effects of its context. Our approach brings three main contributions to the semantic representation scenario: (i) an unsupervised technique that disambiguates and annotates words by their senses, (ii) a multi-sense embeddings model that can be extended to any traditional word embeddings algorithm, and (iii) a recurrent methodology that allows our models to be re-used and their representations refined. We evaluate our approach on six benchmarks for the word similarity task, showing that it produces state-of-the-art results and outperforms several more complex state-of-the-art systems.
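The sense-annotation step the abstract describes, in which each word is replaced by the sense that best fits its context, can be illustrated with a simplified gloss-overlap (Lesk-style) sketch. Note this is a hypothetical stand-in, not the paper's actual MSSA algorithm, and the tiny sense inventory below is invented for illustration:

```python
# Simplified gloss-overlap (Lesk-style) word sense disambiguation sketch.
# The sense inventory is a hypothetical toy example, not taken from the paper.

# Hypothetical inventory: word -> {sense_id: gloss}
SENSES = {
    "bank": {
        "bank.n.01": "a financial institution that accepts deposits and lends money",
        "bank.n.02": "sloping land beside a body of water such as a river",
    },
}

def disambiguate(word, context):
    """Return the sense whose gloss shares the most words with the context."""
    context_words = set(context.lower().split())
    best_sense, best_overlap = None, -1
    for sense_id, gloss in SENSES.get(word, {}).items():
        overlap = len(context_words & set(gloss.split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense_id, overlap
    return best_sense

def annotate(sentence):
    """Replace each word with its disambiguated sense id, if one is known."""
    return [disambiguate(w, sentence) or w for w in sentence.lower().split()]

print(annotate("deposits at the bank earn interest"))
# ['deposits', 'at', 'the', 'bank.n.01', 'earn', 'interest']
```

Here the context "deposits" pulls "bank" toward its financial sense; swapping in a river-related context would select the other sense. MSSA itself works over WordNet synsets and gloss or synset vectors, so this toy inventory only mirrors the shape of the annotation output.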

Related Research
01/22/2021

Enhanced word embeddings using multi-semantic representation through lexical chains

The relationship between words in a sentence often tells us more about t...
06/07/2018

Probabilistic FastText for Multi-Sense Word Embeddings

We introduce Probabilistic FastText, a new model for word embeddings tha...
12/08/2016

Embedding Words and Senses Together via Joint Knowledge-Enhanced Training

Word embeddings are widely used in Natural Language Processing, mainly d...
03/03/2018

Understanding and Improving Multi-Sense Word Embeddings via Extended Robust Principal Component Analysis

Unsupervised learned representations of polysemous words generate a larg...
05/21/2018

Aff2Vec: Affect--Enriched Distributional Word Representations

Human communication includes information, opinions, and reactions. React...
12/11/2019

CoSimLex: A Resource for Evaluating Graded Word Similarity in Context

State of the art natural language processing tools are built on context-...

Code Repositories

MSSA

Transforms text files into synset word files considering WSD via glosses and/or synset-vectors

