Improving the Coverage and the Generalization Ability of Neural Word Sense Disambiguation through Hypernymy and Hyponymy Relationships

11/02/2018
by Loïc Vial et al.

In Word Sense Disambiguation (WSD), the predominant approach is a supervised system trained on sense-annotated corpora. However, the limited quantity of such corpora restricts the coverage and the performance of these systems. In this article, we propose a new method that addresses these issues by taking advantage of the knowledge present in WordNet, and especially the hypernymy and hyponymy relationships between synsets, in order to reduce the number of distinct sense tags needed to disambiguate all words of the lexical database. Our method achieves state-of-the-art results on most WSD evaluation tasks while improving the coverage of supervised systems and reducing the training time and the size of the models, without any additional training data. In addition, we report results that significantly outperform the state of the art when our method is combined with an ensembling technique and the addition of the WordNet Gloss Tagged as a training corpus.
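To illustrate the kind of hypernymy relationships the method builds on, below is a minimal sketch using NLTK's WordNet interface (an assumption; the paper does not prescribe a toolkit). It maps each fine-grained synset of a word to a coarser ancestor synset a few steps up the hypernym hierarchy, which is one simple way to shrink the sense tag vocabulary; the exact grouping criterion used by the authors is described in the paper and is not reproduced here.

```python
# Sketch only, not the authors' exact algorithm: use WordNet's hypernymy
# relation to map fine-grained synsets onto coarser ancestor tags, reducing
# the number of distinct sense tags a classifier has to predict.
# Assumes NLTK with the WordNet corpus downloaded (nltk.download('wordnet')).
from nltk.corpus import wordnet as wn

def ancestor_tag(synset, depth=3):
    """Return the name of an ancestor synset at most `depth` steps up the
    hypernym hierarchy, used here as a coarser sense tag."""
    current = synset
    for _ in range(depth):
        hypernyms = current.hypernyms()
        if not hypernyms:
            break
        current = hypernyms[0]  # follow the first hypernym path
    return current.name()

if __name__ == "__main__":
    # Several senses of "mouse" may collapse onto shared ancestors,
    # so fewer distinct tags are needed to cover the vocabulary.
    for s in wn.synsets("mouse", pos=wn.NOUN):
        print(f"{s.name():30s} -> {ancestor_tag(s)}")
```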

Related research

05/14/2019 · Sense Vocabulary Compression through the Semantic Knowledge of WordNet for Neural Word Sense Disambiguation
In this article, we tackle the issue of the limited quantity of manually...

04/29/2020 · Don't Neglect the Obvious: On the Role of Unambiguous Words in Word Sense Disambiguation
State-of-the-art methods for Word Sense Disambiguation (WSD) combine two...

06/11/2021 · Semi-Supervised and Unsupervised Sense Annotation via Translations
Acquisition of multilingual training data continues to be a challenge in...

02/13/2018 · A Short Survey on Sense-Annotated Corpora for Diverse Languages and Resources
With the advancement of research in word sense disambiguation and deep l...

09/06/2019 · To lemmatize or not to lemmatize: how word normalisation affects ELMo performance in word sense disambiguation
We critically evaluate the widespread assumption that deep learning NLP ...

09/20/2021 · BERT Has Uncommon Sense: Similarity Ranking for Word Sense BERTology
An important question concerning contextualized word embedding (CWE) mod...

09/22/2000 · A Comparison between Supervised Learning Algorithms for Word Sense Disambiguation
This paper describes a set of comparative experiments, including cross-c...
