Which *BERT? A Survey Organizing Contextualized Encoders

10/02/2020
by Patrick Xia, et al.

Pretrained contextualized text encoders are now a staple of the NLP community. We present a survey on language representation learning with the aim of consolidating a series of shared lessons learned across a variety of recent efforts. While significant advancements continue at a rapid pace, we find that enough has now been discovered, in different directions, that we can begin to organize advances according to common themes. Through this organization, we highlight important considerations when interpreting recent contributions and choosing which model to use.


research · 03/11/2021
FairFil: Contrastive Neural Debiasing Method for Pretrained Text Encoders
Pretrained text encoders, such as BERT, have been applied increasingly i...

research · 06/18/2023
Advancing Biomedicine with Graph Representation Learning: Recent Progress, Challenges, and Future Directions
Graph representation learning (GRL) has emerged as a pivotal field that ...

research · 03/29/2021
NLP for Ghanaian Languages
NLP Ghana is an open-source non-profit organization aiming to advance th...

research · 01/15/2022
Automatic Lexical Simplification for Turkish
In this paper, we present the first automatic lexical simplification sys...

research · 06/06/2023
On the Difference of BERT-style and CLIP-style Text Encoders
Masked language modeling (MLM) has been one of the most popular pretrain...

research · 12/19/2022
MANTIS at TSAR-2022 Shared Task: Improved Unsupervised Lexical Simplification with Pretrained Encoders
In this paper we present our contribution to the TSAR-2022 Shared Task o...

research · 03/30/2020
The current state of automated argumentation theory: a literature review
Automated negotiation can be an efficient method for resolving conflict ...
