Generating Sense Embeddings for Syntactic and Semantic Analogy for Portuguese

Word embeddings are numerical vectors that represent words or concepts in a low-dimensional continuous space, capturing useful syntactic and semantic information. Traditional approaches such as Word2Vec, GloVe and FastText share a significant drawback: they produce a single vector representation per word, ignoring the fact that ambiguous words can assume different meanings. In this paper we apply techniques to generate sense embeddings and present the first such experiments carried out for Portuguese. Our experiments show that sense vectors outperform traditional word vectors in syntactic and semantic analogy tasks, indicating that the language resource generated here can improve the performance of NLP tasks in Portuguese.
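To make the analogy evaluation concrete, the sketch below shows how a single syntactic or semantic analogy query is typically answered with pretrained vectors using gensim. The paper does not specify its tooling, so the library choice, the file path, the "word|sense" key convention and the Portuguese example words are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch (assumptions: gensim, word2vec text export, hypothetical path).
from gensim.models import KeyedVectors

# Hypothetical file of pretrained Portuguese sense vectors; with sense
# embeddings each key may carry a sense or POS tag (e.g. "rei|NOUN"),
# depending on how the senses were induced.
vectors = KeyedVectors.load_word2vec_format("sense_vectors_pt.txt", binary=False)

# Classic semantic analogy: "rei" (king) - "homem" (man) + "mulher" (woman)
# should rank "rainha" (queen) among the nearest vectors.
result = vectors.most_similar(positive=["rei", "mulher"], negative=["homem"], topn=5)
for word, score in result:
    print(f"{word}\t{score:.3f}")
```

A full benchmark run works the same way over a list of analogy quadruples; gensim's `evaluate_word_analogies` automates that scoring, though the authors may have used their own evaluation script.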
