Efficient transfer learning for NLP with ELECTRA

04/06/2021
by François Mercier, et al.

Clark et al. [2020] claim that the ELECTRA approach is highly efficient, delivering strong NLP performance relative to its computation budget. This reproducibility study therefore focuses on that claim, summarized by the following question: can we use ELECTRA to achieve close-to-SOTA NLP performance in low-resource settings, in terms of compute cost?
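For context, ELECTRA's efficiency comes from replacing BERT's masked-language-modeling objective with replaced token detection: a small generator corrupts some input tokens, and a discriminator classifies every token as original or replaced. The sketch below is illustrative only and not code from the paper; it probes a pretrained discriminator using the Hugging Face transformers library and the publicly released google/electra-small-discriminator checkpoint, and assumes transformers and torch are installed.

```python
# Illustrative probe of ELECTRA's replaced-token-detection head.
# Assumes the `transformers` and `torch` packages are installed; the
# checkpoint below is Google's publicly released small discriminator.
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-small-discriminator")
model = ElectraForPreTraining.from_pretrained("google/electra-small-discriminator")

# A sentence with one corrupted token ("fake" replaces "jumps").
corrupted = "The quick brown fox fake over the lazy dog"
inputs = tokenizer(corrupted, return_tensors="pt")

with torch.no_grad():
    # One logit per token: positive means "replaced", negative "original".
    logits = model(**inputs).logits

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, score in zip(tokens, logits[0].tolist()):
    print(f"{token:>10s}  {'replaced' if score > 0 else 'original'}")
```

Running this flags "fake" as replaced while leaving the surrounding tokens as original. This per-token signal is what the discriminator is pre-trained on, and it is the source of ELECTRA's sample efficiency: every token contributes to the loss, not just the roughly 15% that are masked under BERT-style pre-training.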


