Sense representations for Portuguese: experiments with sense embeddings and deep neural language models

Sense representations have gone beyond word representations like Word2Vec, GloVe and FastText and achieved state-of-the-art performance on a wide range of natural language processing tasks. Although very useful in many applications, the traditional approaches for generating word embeddings have a significant drawback: they produce a single vector representation per word, ignoring the fact that ambiguous words can assume different meanings. In this paper, we explore unsupervised sense representations which, unlike traditional word embeddings, are able to induce the different senses of a word by analyzing its contextual semantics in a text. The unsupervised sense representations investigated in this paper are sense embeddings and deep neural language models. We present the first experiments on generating sense embeddings for Portuguese. Our experiments show that the sense embedding model (Sense2vec) outperformed traditional word embeddings on syntactic and semantic analogy tasks, indicating that the language resources generated here can improve the performance of NLP tasks in Portuguese. We also evaluated the performance of pre-trained deep neural language models (ELMo and BERT) in two transfer-learning approaches, feature-based and fine-tuning, on the semantic textual similarity task. Our experiments indicate that the fine-tuned Multilingual and Portuguese BERT language models achieved better accuracy than the ELMo model and the baselines.
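
To make the sense-embedding approach concrete, the sketch below shows a minimal Sense2vec-style training setup: each token is annotated with a disambiguating tag before training, so an ambiguous surface form such as "banco" (bank/bench) receives one vector per annotated sense instead of a single merged vector. The toy corpus, the sense tags and the hyperparameters are illustrative assumptions, not the paper's actual data or configuration.

```python
# Minimal Sense2vec-style sketch (illustrative, not the paper's setup).
# Tokens are pre-annotated with disambiguating tags, so the ambiguous
# word "banco" (bank / bench) gets one vocabulary entry per sense.
from gensim.models import Word2Vec

corpus = [
    ["o|DET", "banco|NOUN_FIN", "aprovou|VERB", "o|DET", "empréstimo|NOUN"],
    ["ela|PRON", "sentou|VERB", "no|ADP", "banco|NOUN_OBJ", "da|ADP", "praça|NOUN"],
    ["o|DET", "banco|NOUN_FIN", "cobrou|VERB", "juros|NOUN", "altos|ADJ"],
]

# Training word2vec over the annotated tokens yields one embedding per
# annotated form, e.g. "banco|NOUN_FIN" vs. "banco|NOUN_OBJ".
model = Word2Vec(sentences=corpus, vector_size=50, window=2,
                 min_count=1, epochs=100, seed=42)

print(model.wv.most_similar("banco|NOUN_FIN", topn=3))
```

Once trained, these sense-tagged vectors can be queried in analogy tests in the same way as ordinary word vectors, which allows a direct comparison against Word2Vec, GloVe and FastText.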

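The two transfer-learning approaches mentioned above differ in what is trained: the feature-based approach keeps the pre-trained encoder frozen and feeds its hidden states to a separate regressor, while fine-tuning updates all of the encoder's weights together with a small task head. Below is a minimal sketch of one fine-tuning step for semantic textual similarity framed as sentence-pair regression; the BERTimbau checkpoint name, the example pair, the gold score and the hyperparameters are assumptions for illustration, not the paper's exact configuration.

```python
# Hedged sketch of BERT fine-tuning for semantic textual similarity
# as sentence-pair regression (num_labels=1 triggers an MSE loss in
# Hugging Face Transformers). All specifics below are illustrative.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "neuralmind/bert-base-portuguese-cased"  # public Portuguese BERT (BERTimbau)
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=1)

# One sentence pair with a gold similarity score (ASSIN, for example,
# annotates pairs on a 1-5 similarity scale).
batch = tokenizer("O banco aprovou o empréstimo.",
                  "O empréstimo foi aprovado pelo banco.",
                  return_tensors="pt", padding=True, truncation=True)
gold = torch.tensor([4.5])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
loss = model(**batch, labels=gold).loss  # MSE between prediction and gold
loss.backward()
optimizer.step()
```

In the feature-based variant, one would instead load the bare encoder (AutoModel), keep its parameters frozen, and use its pooled hidden states as input features for an external regressor.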
