
Which *BERT? A Survey Organizing Contextualized Encoders

by Patrick Xia, et al.

Pretrained contextualized text encoders are now a staple of the NLP community. We present a survey on language representation learning with the aim of consolidating a series of shared lessons learned across a variety of recent efforts. While significant advancements continue at a rapid pace, we find that enough has now been discovered, in different directions, that we can begin to organize advances according to common themes. Through this organization, we highlight important considerations when interpreting recent contributions and choosing which model to use.

