GlossBERT: BERT for Word Sense Disambiguation with Gloss Knowledge

08/20/2019
by Luyao Huang et al.

Word Sense Disambiguation (WSD) aims to find the exact sense of an ambiguous word in a particular context. Traditional supervised methods rarely take lexical resources such as WordNet into consideration, even though these resources are widely used in knowledge-based methods. Recent studies have shown the effectiveness of incorporating glosses (sense definitions) into neural networks for WSD. However, compared with traditional word-expert supervised methods, they have not achieved much improvement. In this paper, we focus on how to better leverage gloss knowledge in a supervised neural WSD system. We construct context-gloss pairs and propose three BERT-based models for WSD. We fine-tune the pre-trained BERT model and achieve new state-of-the-art results on the WSD task.
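The core idea described above, turning WSD into sentence-pair classification over context-gloss pairs, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the helper name, sense keys, and glosses below are hypothetical stand-ins for WordNet data, and the "target: gloss" pairing is one plausible way to surface the target word alongside its sense definition.

```python
def build_context_gloss_pairs(context, target_word, candidate_glosses, correct_sense):
    """Pair a context sentence with each candidate gloss of the target word.

    Each pair becomes one binary-classification example for a BERT
    sentence-pair model: label 1 if the gloss defines the target word's
    sense in this context, else 0.
    """
    pairs = []
    for sense_key, gloss in candidate_glosses.items():
        label = 1 if sense_key == correct_sense else 0
        # A sentence-pair encoder would see this roughly as
        # "[CLS] context [SEP] target: gloss [SEP]".
        pairs.append((context, f"{target_word}: {gloss}", label))
    return pairs


# Illustrative toy senses (not real WordNet sense keys).
glosses = {
    "bank%1": "a financial institution that accepts deposits",
    "bank%2": "sloping land beside a body of water",
}
pairs = build_context_gloss_pairs(
    "He sat on the bank of the river.", "bank", glosses, "bank%2"
)
```

At inference time, the model scores every pair for the ambiguous word and the gloss with the highest positive score is selected as the predicted sense.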

Related research

05/21/2018 · Incorporating Glosses into Neural Word Sense Disambiguation
Word Sense Disambiguation (WSD) aims to identify the correct meaning of ...

05/21/2021 · Training Bi-Encoders for Word Sense Disambiguation
Modern transformer-based neural architectures yield impressive results i...

10/14/2021 · A Dual-Attention Neural Network for Pun Location and Using Pun-Gloss Pairs for Interpretation
Pun location is to identify the punning word (usually a word or a phrase...

09/04/2018 · A Novel Neural Sequence Model with Multiple Attentions for Word Sense Disambiguation
Word sense disambiguation (WSD) is a well researched problem in computat...

10/14/2021 · Context-gloss Augmentation for Improving Word Sense Disambiguation
The goal of Word Sense Disambiguation (WSD) is to identify the sense of ...

05/22/2023 · Ambiguity Meets Uncertainty: Investigating Uncertainty Estimation for Word Sense Disambiguation
Word sense disambiguation (WSD), which aims to determine an appropriate ...

08/26/2020 · Language Models and Word Sense Disambiguation: An Overview and Analysis
Transformer-based language models have taken many fields in NLP by storm...
