Attention Can Reflect Syntactic Structure (If You Let It)

01/26/2021
by Vinit Ravishankar, et al.

Since the popularization of the Transformer as a general-purpose feature encoder for NLP, many studies have attempted to decode linguistic structure from its novel multi-head attention mechanism. However, much of this work has focused almost exclusively on English, a language with rigid word order and little inflectional morphology. In this study, we present decoding experiments for multilingual BERT across 18 languages in order to test the generalizability of the claim that dependency syntax is reflected in attention patterns. We show that full trees can be decoded above baseline accuracy from single attention heads, and that individual relations are often tracked by the same heads across languages. Furthermore, in an attempt to address recent debates about the status of attention as an explanatory mechanism, we experiment with fine-tuning mBERT on a supervised parsing objective while freezing different subsets of its parameters. Interestingly, in steering the objective to learn explicit linguistic structure, we find much of the same structure represented in the resulting attention patterns, with notable differences depending on which parameters are frozen.
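
The abstract does not spell out the decoding algorithm, but a standard way to recover a full tree from a single attention head is to run a maximum-spanning-arborescence (Chu-Liu/Edmonds) decoder over that head's attention matrix. The sketch below is a minimal illustration rather than the authors' pipeline; it assumes the HuggingFace transformers and networkx libraries, and the layer/head indices, the arc-direction convention (a dependent attends to its head), and the treatment of subword tokens as words are all assumptions.

```python
# A minimal sketch, not the authors' exact pipeline: decode a dependency
# tree from one mBERT attention head with a maximum-spanning-arborescence
# (Chu-Liu/Edmonds) decoder. Layer/head indices are illustrative, and
# subword tokens are treated as words for simplicity.
import networkx as nx
import torch
from transformers import AutoModel, AutoTokenizer

MODEL = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL, output_attentions=True)
model.eval()

def decode_tree(sentence: str, layer: int = 7, head: int = 5):
    """Return (head, dependent) arcs decoded from a single attention head."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # outputs.attentions: one [batch, heads, seq, seq] tensor per layer.
    attn = outputs.attentions[layer][0, head]  # [seq, seq]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    n = len(tokens)
    # Skip [CLS] (index 0) and [SEP] (index n-1). By assumption,
    # attn[dep, hd] scores the arc hd -> dep, i.e. a dependent token
    # attends to its syntactic head.
    G = nx.DiGraph()
    for dep in range(1, n - 1):
        for hd in range(1, n - 1):
            if dep != hd:
                G.add_edge(hd, dep, weight=attn[dep, hd].item())
    tree = nx.maximum_spanning_arborescence(G, attr="weight")
    return [(tokens[h], tokens[d]) for h, d in tree.edges]

print(decode_tree("The cat sat on the mat ."))
```

For the fine-tuning experiment, freezing different subsets of parameters reduces, in code, to toggling `requires_grad` before training on the parsing objective. Which subsets the paper actually freezes is specified only in the full text, so the attention-only rule below is purely illustrative.

```python
# Illustrative only: train the attention projections, freeze everything else.
for name, param in model.named_parameters():
    param.requires_grad = "attention" in name
```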


Related research

11/27/2019 · Do Attention Heads in BERT Track Syntactic Dependencies?
We investigate the extent to which individual attention heads in pretrai...

01/27/2021 · On the Evolution of Syntactic Information Encoded by BERT's Contextualized Representations
The adaptation of pretrained language models to solve supervised tasks h...

05/01/2020 · Cross-Linguistic Syntactic Evaluation of Word Prediction Models
A range of studies have concluded that neural word prediction models can...

04/29/2020 · Do Neural Language Models Show Preferences for Syntactic Formalisms?
Recent work on the interpretability of deep neural language models has c...

02/01/2023 · Inference of Partial Colexifications from Multilingual Wordlists
The past years have seen a drastic rise in studies devoted to the invest...

05/09/2020 · It's Morphin' Time! Combating Linguistic Discrimination with Inflectional Perturbations
Training on only perfect Standard English corpora predisposes pre-traine...

10/18/2021 · Compositional Attention: Disentangling Search and Retrieval
Multi-head, key-value attention is the backbone of the widely successful...
