
Which *BERT? A Survey Organizing Contextualized Encoders

10/02/2020
by Patrick Xia, et al.

Pretrained contextualized text encoders are now a staple of the NLP community. We present a survey on language representation learning with the aim of consolidating a series of shared lessons learned across a variety of recent efforts. While significant advancements continue at a rapid pace, we find that enough has now been discovered, in different directions, that we can begin to organize advances according to common themes. Through this organization, we highlight important considerations when interpreting recent contributions and choosing which model to use.
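To make the object of the survey concrete, below is a minimal sketch (not from the paper) of obtaining contextualized token representations from a pretrained *BERT-style encoder with the Hugging Face transformers library; the checkpoint name and the mean-pooling step are illustrative assumptions, not recommendations from the survey.

```python
# Minimal sketch: encode a sentence with a pretrained contextualized encoder.
# The checkpoint and pooling choice are assumptions for illustration only.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-uncased"  # any *BERT-style encoder checkpoint could be used
tokenizer = AutoTokenizer.from_pretrained(model_name)
encoder = AutoModel.from_pretrained(model_name)
encoder.eval()

sentence = "Pretrained contextualized text encoders are now a staple of NLP."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = encoder(**inputs)

# One contextual vector per (sub)token; mean-pool them for a single sentence vector.
token_embeddings = outputs.last_hidden_state       # shape: (1, seq_len, hidden_size)
sentence_embedding = token_embeddings.mean(dim=1)  # shape: (1, hidden_size)
print(sentence_embedding.shape)
```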


Related research

03/11/2021  FairFil: Contrastive Neural Debiasing Method for Pretrained Text Encoders
Pretrained text encoders, such as BERT, have been applied increasingly i...

01/11/2019  Grammatical Analysis of Pretrained Sentence Encoders with Acceptability Judgments
Recent pretrained sentence encoders achieve state of the art results on ...

03/29/2021  NLP for Ghanaian Languages
NLP Ghana is an open-source non-profit organization aiming to advance th...

01/15/2022  Automatic Lexical Simplification for Turkish
In this paper, we present the first automatic lexical simplification sys...

05/19/2023  Deep Learning Approaches to Lexical Simplification: A Survey
Lexical Simplification (LS) is the task of replacing complex for simpler...

12/19/2022  MANTIS at TSAR-2022 Shared Task: Improved Unsupervised Lexical Simplification with Pretrained Encoders
In this paper we present our contribution to the TSAR-2022 Shared Task o...

03/30/2020  The current state of automated argumentation theory: a literature review
Automated negotiation can be an efficient method for resolving conflict ...