Semi Supervised Preposition-Sense Disambiguation using Multilingual Data

11/27/2016
by Hila Gonen, et al.

Prepositions are very common and very ambiguous, and understanding their sense is critical for understanding the meaning of a sentence. Supervised corpora for the preposition-sense disambiguation task are small, motivating a semi-supervised approach. We show that signals from unannotated multilingual data can be used to improve supervised preposition-sense disambiguation. Our approach pre-trains an LSTM encoder to predict the translation of a preposition, then incorporates the pre-trained encoder as a component of a supervised classification system and fine-tunes it for the task. The multilingual signals consistently improve results on two preposition-sense datasets.
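The pre-train-then-fine-tune scheme described in the abstract can be sketched in a few lines of PyTorch. This is a minimal illustration, not the paper's implementation: the class name, layer sizes, number of foreign prepositions, and number of sense labels are all hypothetical, and the real system encodes richer context around the preposition.

```python
# Hypothetical sketch of the two-stage scheme: (1) pre-train an LSTM encoder
# to predict a preposition's foreign-language translation from its context,
# (2) reuse that encoder inside a sense classifier and fine-tune end-to-end.
import torch
import torch.nn as nn

class ContextEncoder(nn.Module):
    """BiLSTM over the tokens surrounding a preposition, pooled to one vector."""
    def __init__(self, vocab_size, emb_dim=32, hid_dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        hidden, _ = self.lstm(self.emb(token_ids))
        return hidden.mean(dim=1)  # (batch, 2 * hid_dim) pooled representation

encoder = ContextEncoder(vocab_size=100)

# Stage 1: pre-training objective -- classify the preposition's translation
# among (hypothetically) 20 foreign prepositions, using unannotated parallel data.
translation_head = nn.Linear(128, 20)
pretrain_model = nn.Sequential(encoder, translation_head)

# Stage 2: the same (now pre-trained) encoder feeds a sense classifier,
# fine-tuned on the small supervised sense-annotated corpus.
sense_head = nn.Linear(128, 34)  # 34 is an illustrative number of senses
classifier = nn.Sequential(encoder, sense_head)

opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)
x = torch.randint(0, 100, (4, 7))   # batch of 4 contexts, 7 tokens each
y = torch.randint(0, 34, (4,))      # gold sense labels
loss = nn.functional.cross_entropy(classifier(x), y)
loss.backward()
opt.step()
```

Because both stages share the `encoder` module, gradients from the supervised sense objective update the translation-pre-trained weights, which is the fine-tuning step the abstract refers to.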


Related research

06/11/2021 · Semi-Supervised and Unsupervised Sense Annotation via Translations
Acquisition of multilingual training data continues to be a challenge in...

08/04/2020 · NLPDove at SemEval-2020 Task 12: Improving Offensive Language Detection with Cross-lingual Transfer
This paper describes our approach to the task of identifying offensive l...

05/12/2018 · Huge Automatically Extracted Training Sets for Multilingual Word Sense Disambiguation
We release to the community six large-scale sense-annotated datasets in ...

06/25/2017 · Beyond Bilingual: Multi-sense Word Embeddings using Multilingual Context
Word embeddings, which represent a word as a point in a vector space, ha...

11/11/2021 · Multilingual and Multilabel Emotion Recognition using Virtual Adversarial Training
Virtual Adversarial Training (VAT) has been effective in learning robust...

08/18/2016 · Multilingual Modal Sense Classification using a Convolutional Neural Network
Modal sense classification (MSC) is a special WSD task that depends on t...

05/22/2023 · Extrapolating Multilingual Understanding Models as Multilingual Generators
Multilingual understanding models (or encoder-based), pre-trained via ma...
