
Grammatical Analysis of Pretrained Sentence Encoders with Acceptability Judgments

by Alex Warstadt, et al.
New York University

Recent pretrained sentence encoders achieve state-of-the-art results on language understanding tasks, but does this mean they have implicit knowledge of syntactic structure? We introduce a grammatically annotated development set for the Corpus of Linguistic Acceptability (CoLA; Warstadt et al., 2018), which we use to investigate the grammatical knowledge of three pretrained encoders, including the popular OpenAI Transformer (Radford et al., 2018) and BERT (Devlin et al., 2018). We fine-tune these encoders to do acceptability classification over CoLA and compare the models' performance on the annotated analysis set. Some phenomena, e.g. modification by adjuncts, are easy for all models to learn, while others, e.g. long-distance movement, are learned effectively only by models with strong overall performance, and others still, e.g. morphological agreement, are hardly learned by any model.
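The per-phenomenon comparison described above amounts to scoring a classifier's binary acceptability predictions separately for each annotated grammatical category, typically with the Matthews correlation coefficient (MCC), CoLA's standard metric. The sketch below illustrates the idea with a hypothetical helper (`mcc_by_phenomenon` and its input format are assumptions for illustration, not the authors' code); examples tagged with multiple phenomena count toward each of their tags.

```python
import math
from collections import defaultdict

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-matrix counts."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def mcc_by_phenomenon(examples):
    """examples: iterable of (gold, pred, phenomena) triples, where gold and
    pred are 1 (acceptable) or 0 (unacceptable), and phenomena is a list of
    annotation tags for that sentence. Returns {phenomenon: MCC}."""
    counts = defaultdict(lambda: [0, 0, 0, 0])  # tp, tn, fp, fn per tag
    for gold, pred, phenomena in examples:
        for tag in phenomena:
            c = counts[tag]
            if gold == 1 and pred == 1:
                c[0] += 1
            elif gold == 0 and pred == 0:
                c[1] += 1
            elif gold == 0 and pred == 1:
                c[2] += 1
            else:
                c[3] += 1
    return {tag: mcc(*c) for tag, c in counts.items()}

# Tiny illustrative run: the model is right on both adjunct examples
# and wrong on both agreement examples.
examples = [
    (1, 1, ["adjunct"]),
    (0, 0, ["adjunct"]),
    (1, 0, ["agreement"]),
    (0, 1, ["agreement"]),
]
print(mcc_by_phenomenon(examples))  # adjunct scores 1.0, agreement -1.0
```

MCC is preferred over accuracy here because the annotated phenomenon subsets can be heavily imbalanced between acceptable and unacceptable sentences.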

