Assessing BERT's Syntactic Abilities

01/16/2019
by Yoav Goldberg, et al.

I assess the extent to which the recently introduced BERT model captures English syntactic phenomena, using (1) naturally-occurring subject-verb agreement stimuli; (2) "colorless green ideas" subject-verb agreement stimuli, in which content words in natural sentences are randomly replaced with words sharing the same part-of-speech and inflection; and (3) manually crafted stimuli for subject-verb agreement and reflexive anaphora phenomena. The BERT model performs remarkably well on all cases.
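The evaluation behind all three stimulus sets is a forced-choice cloze test: mask the verb, then check whether the model assigns higher probability to the correctly inflected form than to the incorrect one. A minimal sketch of that harness follows; `evaluate_agreement` and `toy_score` are hypothetical names, and the toy heuristic stands in for BERT's masked-LM scores only so the sketch is self-contained:

```python
def evaluate_agreement(stimuli, score):
    """Fraction of stimuli where the scorer prefers the correct verb form.

    stimuli: list of (masked_sentence, correct_form, incorrect_form).
    score(sentence, verb): higher means the model finds `verb` more
    likely at the [MASK] position.
    """
    hits = sum(
        score(sent, good) > score(sent, bad)
        for sent, good, bad in stimuli
    )
    return hits / len(stimuli)


def toy_score(sentence, verb):
    """Naive stand-in scorer (in the real setup, BERT's masked-LM logit
    for `verb` at the [MASK] token would be used): prefer a plural verb
    exactly when the first noun looks plural."""
    first_noun = sentence.split()[1]          # stimuli start "The NOUN ..."
    looks_plural = first_noun.endswith("s")
    return 1.0 if (verb in {"are", "were"}) == looks_plural else 0.0


# Agreement-attraction stimuli: the intervening noun ("cabinet"/"books")
# mismatches the subject in number, which is what makes the task hard
# for models relying on the most recent noun.
stimuli = [
    ("The keys to the cabinet [MASK] on the table.", "are", "is"),
    ("The author of the books [MASK] famous.", "is", "are"),
]
print(evaluate_agreement(stimuli, toy_score))  # -> 1.0
```

The same harness works with any scorer; plugging in a masked language model only requires replacing `toy_score` with a function that returns the model's probability for the candidate verb at the masked position.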


research
04/14/2022

Does BERT really agree? Fine-grained Analysis of Lexical Dependence on a Syntactic Task

Although transformer-based Neural Language Models demonstrate impressive...
research
12/23/2019

Probing the phonetic and phonological knowledge of tones in Mandarin TTS models

This study probes the phonetic and phonological knowledge of lexical ton...
research
03/02/2022

Discontinuous Constituency and BERT: A Case Study of Dutch

In this paper, we set out to quantify the syntactic capacity of BERT in ...
research
08/26/2019

Does BERT agree? Evaluating knowledge of structure dependence through agreement relations

Learning representations that accurately model semantics is an important...
research
10/28/2022

Probing for targeted syntactic knowledge through grammatical error detection

Targeted studies testing knowledge of subject-verb agreement (SVA) indic...
research
12/08/2022

Assessing the Capacity of Transformer to Abstract Syntactic Representations: A Contrastive Analysis Based on Long-distance Agreement

The long-distance agreement, evidence for syntactic structure, is increa...
research
11/02/2020

Abstracting Influence Paths for Explaining (Contextualization of) BERT Models

While "attention is all you need" may be proving true, we do not yet kno...
