
Universal Dependencies according to BERT: both more specific and more general

04/30/2020
by Tomasz Limisiewicz, et al.

This work focuses on analyzing the form and extent of syntactic abstraction captured by BERT by extracting labeled dependency trees from self-attentions. Previous work showed that individual BERT heads tend to encode particular dependency relation types. We extend these findings by explicitly comparing BERT relations to Universal Dependencies (UD) annotations, showing that they often do not match one-to-one. We suggest a method for relation identification and syntactic tree construction. Our approach produces significantly more consistent dependency trees than previous work, showing that it better explains the syntactic abstractions in BERT. At the same time, it can be successfully applied with only a minimal amount of supervision and generalizes well across languages.
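To make the idea of extracting dependency trees from self-attention concrete, here is a minimal sketch of the simplest variant used in attention-probing work: treat each token's syntactic head as the position its attention weight is largest on. This is an illustrative toy, not the authors' method (which identifies relations with minimal supervision and builds more consistent trees); the function name and the toy matrix are assumptions for the example.

```python
import numpy as np

def greedy_dependency_tree(attn, root=0):
    """Pick each token's head as the position it attends to most.

    attn: (n, n) matrix, attn[i, j] = attention weight token i puts on token j.
    root: index of the token forced to be the root (its head is set to -1).

    Note: a greedy argmax does not guarantee a well-formed tree; probing
    work typically decodes a maximum spanning tree instead.
    """
    a = attn.copy()
    np.fill_diagonal(a, -np.inf)   # a token cannot head itself
    heads = a.argmax(axis=1)       # most-attended position per token
    heads[root] = -1               # the chosen root has no head
    return heads

# Toy 4-token example; token 0 plays the role of the main verb (root).
attn = np.array([
    [0.10, 0.30, 0.40, 0.20],
    [0.70, 0.10, 0.10, 0.10],   # token 1 attends mostly to token 0
    [0.20, 0.60, 0.10, 0.10],   # token 2 attends mostly to token 1
    [0.80, 0.10, 0.05, 0.05],   # token 3 attends mostly to token 0
])
print(greedy_dependency_tree(attn, root=0))  # [-1  0  1  0]
```

In practice the attention matrix would come from a specific BERT head (e.g. via the `output_attentions` option in the Hugging Face Transformers library), and, as the abstract notes, single heads often do not map one-to-one onto UD relation types.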

Related research
11/27/2019

Do Attention Heads in BERT Track Syntactic Dependencies?

We investigate the extent to which individual attention heads in pretrai...
04/29/2020

Do Neural Language Models Show Preferences for Syntactic Formalisms?

Recent work on the interpretability of deep neural language models has c...
05/28/2018

Syntactic Dependency Representations in Neural Relation Classification

We investigate the use of different syntactic dependency representations...
05/09/2020

Finding Universal Grammatical Relations in Multilingual BERT

Recent work has found evidence that Multilingual BERT (mBERT), a transfo...
11/11/2020

Multilingual Irony Detection with Dependency Syntax and Neural Models

This paper presents an in-depth investigation of the effectiveness of de...
05/04/2020

pyBART: Evidence-based Syntactic Transformations for IE

Syntactic dependencies can be predicted with high accuracy, and are usef...
12/09/2021

How Universal is Genre in Universal Dependencies?

This work provides the first in-depth analysis of genre in Universal Dep...