Tapping BERT for Preposition Sense Disambiguation

11/27/2021
by Siddhesh Pawar, et al.

Prepositions are frequently occurring polysemous words. Disambiguating them is crucial in tasks like semantic role labelling, question answering, text entailment, and noun compound paraphrasing. In this paper, we propose a novel methodology for preposition sense disambiguation (PSD) that does not use any linguistic tools. In a supervised setting, the machine learning model is presented with sentences wherein prepositions have been annotated with senses; these senses are IDs from The Preposition Project (TPP). We use the hidden layer representations from pre-trained BERT and BERT variants, and classify these latent representations into the correct sense ID using a Multi-Layer Perceptron (MLP). The dataset used for this task is from SemEval-2007 Task 6. Our methodology achieves an accuracy of 86.85%, which is better than the state of the art.
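The classification step described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the preposition embedding here is a random stand-in for the hidden-layer representation that pre-trained BERT would produce, the weights are untrained, and the hidden width (256) and sense-inventory size (332) are hypothetical choices.

```python
import numpy as np

HIDDEN = 768      # hidden size of BERT-base
NUM_SENSES = 332  # hypothetical size of the TPP sense inventory

rng = np.random.default_rng(0)

# Stand-in for the BERT hidden-layer representation of the preposition
# token (in the paper this comes from pre-trained BERT / BERT variants).
prep_embedding = rng.standard_normal(HIDDEN)

# One-hidden-layer MLP classification head. The weights are random here;
# in practice they would be trained on the SemEval-2007 Task 6 data.
W1 = rng.standard_normal((HIDDEN, 256)) * 0.02
b1 = np.zeros(256)
W2 = rng.standard_normal((256, NUM_SENSES)) * 0.02
b2 = np.zeros(NUM_SENSES)

h = np.maximum(prep_embedding @ W1 + b1, 0.0)  # ReLU hidden layer
logits = h @ W2 + b2
probs = np.exp(logits - logits.max())
probs /= probs.sum()                           # softmax over sense IDs
pred_sense = int(np.argmax(probs))             # predicted TPP sense ID
```

In a real pipeline, the stand-in embedding would be replaced by the hidden state of the preposition token extracted from a pre-trained BERT model, and the MLP would be trained with cross-entropy loss against the annotated sense IDs.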


