Adapting BERT for Word Sense Disambiguation with Gloss Selection Objective and Example Sentences

09/24/2020
by Boon Peng Yap, et al.

Domain adaptation or transfer learning using pre-trained language models such as BERT has proven to be an effective approach for many natural language processing tasks. In this work, we propose to formulate word sense disambiguation (WSD) as a relevance ranking task and fine-tune BERT on a sequence-pair ranking objective to select the most probable sense definition given a context sentence and a list of candidate sense definitions. We also introduce a data augmentation technique for WSD that uses existing example sentences from WordNet. With the proposed training objective and data augmentation technique, our models achieve state-of-the-art results on the English all-words benchmark datasets.
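To make the gloss selection idea concrete, here is a minimal sketch using the Hugging Face transformers library and the NLTK WordNet interface. The model name, the single-logit scoring head, and the rank_glosses helper are illustrative assumptions rather than the authors' exact implementation (the official code is in the BERT-WSD repository listed below), and an untrained head produces arbitrary scores until fine-tuned with the softmax cross-entropy described above.

import torch
from torch.nn.functional import softmax
from nltk.corpus import wordnet as wn  # requires nltk.download("wordnet")
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# num_labels=1 -> a single relevance logit per context-gloss pair.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1
)
model.eval()

def rank_glosses(context, target):
    """Rank the WordNet glosses of `target` by relevance to `context`."""
    senses = wn.synsets(target)
    # One sequence pair per candidate sense: (context sentence, gloss).
    encoded = tokenizer(
        [context] * len(senses),
        [s.definition() for s in senses],
        padding=True, truncation=True, return_tensors="pt",
    )
    with torch.no_grad():
        scores = model(**encoded).logits.squeeze(-1)
    # The gloss selection objective trains these scores with a softmax
    # cross-entropy over the candidate senses of the same target word.
    probs = softmax(scores, dim=0)
    return sorted(zip(senses, probs.tolist()), key=lambda pair: -pair[1])

# Data augmentation: WordNet example sentences supply extra
# (context, gloss) training pairs for their own synsets.
augmented = [
    (example, synset.definition())
    for synset in wn.synsets("bank")
    for example in synset.examples()
]

for sense, prob in rank_glosses("She sat on the bank of the river.", "bank"):
    print(f"{prob:.3f}  {sense.name()}: {sense.definition()}")

At inference time, the highest-scoring gloss is taken as the predicted sense; at training time, normalizing scores over a word's candidate senses (rather than classifying each pair independently) is what distinguishes the ranking formulation from a plain binary context-gloss classifier.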


Related Research

06/15/2021 · Incorporating Word Sense Disambiguation in Neural Language Models
We present two supervised (pre-)training methods to incorporate gloss de...

10/14/2021 · Context-gloss Augmentation for Improving Word Sense Disambiguation
The goal of Word Sense Disambiguation (WSD) is to identify the sense of ...

10/12/2020 · EFSG: Evolutionary Fooling Sentences Generator
Large pre-trained language representation models (LMs) have recently col...

06/23/2021 · Classifying Textual Data with Pre-trained Vision Models through Transfer Learning and Data Transformations
Knowledge is acquired by humans through experience, and no boundary is s...

04/30/2020 · WiC-TSV: An Evaluation Benchmark for Target Sense Verification of Words in Context
In this paper, we present WiC-TSV (Target Sense Verification for Words i...

09/20/2021 · BERT Has Uncommon Sense: Similarity Ranking for Word Sense BERTology
An important question concerning contextualized word embedding (CWE) mod...

05/22/2020 · L2R2: Leveraging Ranking for Abductive Reasoning
The abductive natural language inference task (αNLI) is proposed to eval...

Code Repositories

BERT-WSD

[EMNLP 2020] Adapting BERT for Word Sense Disambiguation with Gloss Selection Objective and Example Sentences
