Morph Call: Probing Morphosyntactic Content of Multilingual Transformers

04/26/2021
by Vladislav Mikhailov, et al.

The outstanding performance of transformer-based language models on a great variety of NLP and NLU tasks has stimulated interest in exploring their inner workings. Recent research has focused primarily on higher-level and complex linguistic phenomena such as syntax, semantics, world knowledge, and common sense. Most of these studies, however, are anglocentric, and little is known about other languages, particularly their morphosyntactic properties. To this end, our work presents Morph Call, a suite of 46 probing tasks for four Indo-European languages of different morphology: English, French, German and Russian. We propose a new type of probing task based on the detection of guided sentence perturbations. We use a combination of neuron-, layer- and representation-level introspection techniques to analyze the morphosyntactic content of four multilingual transformers, including their less explored distilled versions. In addition, we examine how fine-tuning for POS tagging affects the models' knowledge. The results show that fine-tuning can both improve and degrade probing performance, and that it changes how morphosyntactic knowledge is distributed across the model. The code and data are publicly available, and we hope this work helps fill the gaps in this less studied aspect of transformers.
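To make the layer-level probing setup concrete, the sketch below trains a simple logistic-regression probe on frozen hidden states from each encoder layer to detect a sentence perturbation. This is a minimal illustration, not the authors' exact pipeline: the model name (bert-base-multilingual-cased), the mean pooling, the toy word-order perturbation, and the choice of probe are all illustrative assumptions.

```python
# Minimal sketch of layer-wise perturbation-detection probing.
# Assumptions: mBERT as the encoder, mean pooling over tokens,
# and a logistic-regression probe fit per layer.
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import LogisticRegression

MODEL_NAME = "bert-base-multilingual-cased"  # illustrative multilingual encoder
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME, output_hidden_states=True)
model.eval()

def layer_features(sentences, layer):
    """Mean-pool the hidden states of one encoder layer for each sentence."""
    feats = []
    with torch.no_grad():
        for sent in sentences:
            enc = tokenizer(sent, return_tensors="pt", truncation=True)
            hidden = model(**enc).hidden_states[layer]  # (1, seq_len, dim)
            feats.append(hidden.mean(dim=1).squeeze(0).numpy())
    return feats

# Toy perturbation-detection data: 0 = original, 1 = perturbed
# (a hypothetical word-order shuffle; the real suite covers more perturbation types).
sentences = ["the cat sat on the mat", "mat the on sat cat the"]
labels = [0, 1]

# Fit one probe per layer; index 0 is the embedding output, 1..N the encoder layers.
for layer in range(1, model.config.num_hidden_layers + 1):
    X = layer_features(sentences, layer)
    probe = LogisticRegression(max_iter=1000).fit(X, labels)
    print(f"layer {layer:2d}: train accuracy = {probe.score(X, labels):.2f}")
```

Run on a real perturbation dataset with held-out evaluation, per-layer probe accuracy indicates where in the encoder the relevant morphosyntactic signal concentrates.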


Related research

02/28/2021 · RuSentEval: Linguistic Source, Encoder Force!
The success of pre-trained transformer language models has brought a gre...

09/22/2021 · Role of Language Relatedness in Multilingual Fine-tuning of Language Models: A Case Study in Indo-Aryan Languages
We explore the impact of leveraging the relatedness of languages that be...

08/14/2023 · Comparison between parameter-efficient techniques and full fine-tuning: A case study on multilingual news article classification
Adapters and Low-Rank Adaptation (LoRA) are parameter-efficient fine-tun...

04/24/2023 · KInITVeraAI at SemEval-2023 Task 3: Simple yet Powerful Multilingual Fine-Tuning for Persuasion Techniques Detection
This paper presents the best-performing solution to the SemEval 2023 Tas...

05/31/2023 · FEED PETs: Further Experimentation and Expansion on the Disambiguation of Potentially Euphemistic Terms
Transformers have been shown to work well for the task of English euphem...

03/09/2022 · PALI-NLP at SemEval-2022 Task 4: Discriminative Fine-tuning of Deep Transformers for Patronizing and Condescending Language Detection
Patronizing and condescending language (PCL) has a large harmful impact ...

08/31/2021 · T3-Vis: a visual analytic framework for Training and fine-Tuning Transformers in NLP
Transformers are the dominant architecture in NLP, but their training an...
