1Cademy at Semeval-2022 Task 1: Investigating the Effectiveness of Multilingual, Multitask, and Language-Agnostic Tricks for the Reverse Dictionary Task

06/08/2022
by   Zhiyong Wang, et al.

This paper describes our system for the SemEval-2022 task of matching dictionary glosses to word embeddings. We focus on the Reverse Dictionary Track of the competition, which maps multilingual glosses to reconstructed vector representations. More specifically, our models convert input sentences into three types of embeddings: SGNS, Char, and Electra. We conduct several experiments applying different neural network cells, general multilingual and multitask structures, and language-agnostic tricks to the task. We also provide comparisons across the different types of word embeddings, along with ablation studies that identify helpful strategies. Our initial transformer-based model achieves relatively low performance; however, experiments with different retokenization methodologies yield improvements. Our proposed ELMo-based monolingual model achieves the best results, and its multitask and multilingual variants show competitive performance as well.
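The reverse-dictionary setup can be illustrated with a minimal sketch. This is a hypothetical toy example with made-up glosses, fixed 4-dimensional target embeddings, and a simple bag-of-words least-squares encoder, not the paper's ELMo/transformer architecture or its SGNS/Char/Electra targets: a model reads a definition and regresses toward the embedding of the word it defines, and predictions are ranked by cosine similarity.

```python
import numpy as np

# Toy reverse-dictionary sketch: map a gloss (definition) to a target
# word embedding. All data here is hypothetical and for illustration only.

vocab = ["feline", "pet", "small", "canine", "loyal", "animal"]
word_to_idx = {w: i for i, w in enumerate(vocab)}

def bow(gloss):
    """Bag-of-words vector of a gloss over the toy vocabulary."""
    v = np.zeros(len(vocab))
    for tok in gloss.split():
        if tok in word_to_idx:
            v[word_to_idx[tok]] += 1.0
    return v

# Toy training pairs: gloss -> fixed target embedding of the defined word.
targets = {
    "cat": np.array([1.0, 0.0, 0.5, 0.0]),
    "dog": np.array([0.0, 1.0, 0.0, 0.5]),
}
glosses = {
    "cat": "small feline pet animal",
    "dog": "loyal canine pet animal",
}

X = np.stack([bow(g) for g in glosses.values()])
Y = np.stack([targets[w] for w in glosses])

# Least-squares "encoder": W maps a gloss vector into embedding space.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def predict(gloss):
    return bow(gloss) @ W

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# A held-out paraphrase of the "cat" gloss should land closer to the
# "cat" embedding than to the "dog" embedding.
pred = predict("small feline animal")
print(cosine(pred, targets["cat"]) > cosine(pred, targets["dog"]))  # True
```

The evaluation mirrors the track's embedding-reconstruction objective: rather than classifying over a closed vocabulary, the model outputs a vector that is compared against reference embeddings by similarity.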


