Cross-Lingual Transfer of Cognitive Processing Complexity

02/24/2023
by Charlotte Pouw et al.

When humans read a text, their eye movements are influenced by the structural complexity of the input sentences. This cognitive phenomenon holds across languages and recent studies indicate that multilingual language models utilize structural similarities between languages to facilitate cross-lingual transfer. We use sentence-level eye-tracking patterns as a cognitive indicator for structural complexity and show that the multilingual model XLM-RoBERTa can successfully predict varied patterns for 13 typologically diverse languages, despite being fine-tuned only on English data. We quantify the sensitivity of the model to structural complexity and distinguish a range of complexity characteristics. Our results indicate that the model develops a meaningful bias towards sentence length but also integrates cross-lingual differences. We conduct a control experiment with randomized word order and find that the model seems to additionally capture more complex structural information.
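The described setup (fine-tune a multilingual encoder on English sentence-level gaze data, then apply it zero-shot to other languages) can be illustrated with a minimal sketch. This is not the authors' code: the choice of regression head, the specific eye-tracking features, and the example values below are assumptions for illustration only; it uses the standard Hugging Face `transformers` API with the publicly available `xlm-roberta-base` checkpoint.

```python
# Minimal sketch (not the authors' implementation): fine-tune XLM-RoBERTa as a
# sentence-level regressor over eye-tracking features on English data, then
# predict gaze patterns for a sentence in another language without further training.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical sentence-level gaze features (e.g. total fixation duration,
# fixation count, regression rate); the real feature set may differ.
NUM_FEATURES = 3

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base",
    num_labels=NUM_FEATURES,
    problem_type="regression",  # MSE loss over the feature vector
)

# Toy English training example: sentence paired with (normalized) gaze targets.
batch = tokenizer(["The cat sat on the mat."], return_tensors="pt", padding=True)
targets = torch.tensor([[0.42, 0.31, 0.10]])  # illustrative values only

model.train()
outputs = model(**batch, labels=targets)
outputs.loss.backward()  # an optimizer step would follow in a real training loop

# Zero-shot cross-lingual transfer: predict gaze features for a Dutch sentence,
# without any non-English eye-tracking data in training.
model.eval()
with torch.no_grad():
    nl = tokenizer(["De kat zat op de mat."], return_tensors="pt")
    preds = model(**nl).logits  # shape: (1, NUM_FEATURES)
print(preds)
```

The control experiment mentioned above would amount to shuffling the word order of each input sentence before tokenization and comparing the resulting predictions, isolating how much of the model's behaviour depends on structure beyond sentence length.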

