Zhestyatsky at SemEval-2021 Task 2: ReLU over Cosine Similarity for BERT Fine-tuning

04/13/2021
by   Boris Zhestiankin, et al.

This paper presents our contribution to SemEval-2021 Task 2: Multilingual and Cross-lingual Word-in-Context Disambiguation (MCL-WiC). Our experiments cover the English (EN-EN) sub-track from the multilingual setting of the task. We experiment with several pre-trained language models and investigate the impact of different top layers on fine-tuning. We find that the combination of cosine similarity and a ReLU activation leads to the most effective fine-tuning procedure. Our best model achieves an accuracy of 92.7 in the EN-EN sub-track.
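The top layer the abstract describes can be sketched as follows: take the contextual embeddings of the target word in the two sentences, compute their cosine similarity, and pass it through a ReLU before classification. This is a minimal, hypothetical NumPy illustration of that combination, not the authors' exact fine-tuning setup (in the paper the layer sits on top of BERT and is trained end-to-end; the toy vectors below merely stand in for BERT token embeddings):

```python
import numpy as np

def relu(x):
    # ReLU activation: clamps negative values to zero
    return np.maximum(0.0, x)

def cosine_similarity(u, v):
    # Cosine of the angle between two embedding vectors
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def wic_score(emb1, emb2):
    """ReLU over cosine similarity of the target word's contextual
    embeddings from the two sentences (a simplified sketch of the
    top layer described in the abstract)."""
    return relu(cosine_similarity(emb1, emb2))

# Toy stand-ins for BERT embeddings of the same word in two contexts
same_sense = wic_score(np.array([1.0, 0.5]), np.array([0.9, 0.6]))
diff_sense = wic_score(np.array([1.0, 0.0]), np.array([-1.0, 0.1]))
```

A score near 1 suggests the word is used in the same sense in both contexts; a negative cosine similarity is zeroed out by the ReLU, so dissimilar uses bottom out at 0.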


