Exploiting Cloze Questions for Few-Shot Text Classification and Natural Language Inference

01/21/2020
by   Timo Schick, et al.

Some NLP tasks can be solved in a fully unsupervised fashion by providing a pretrained language model with "task descriptions" in natural language (e.g., Radford et al., 2019). While this approach underperforms its supervised counterpart, we show in this work that the two ideas can be combined: We introduce Pattern-Exploiting Training (PET), a semi-supervised training procedure that reformulates input examples as cloze-style phrases which help the language model understand the given task. These phrases are then used to assign soft labels to a large set of unlabeled examples. Finally, regular supervised training is performed on the resulting training set. On several tasks, we show that PET outperforms both supervised training and unsupervised approaches in low-resource settings by a large margin.
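The core PET pipeline described above (cloze pattern, verbalizer tokens scored by a masked language model, softmaxed into soft labels) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the pattern, verbalizer, and the `toy_mlm_logits` stand-in for a real pretrained masked LM are all hypothetical.

```python
import math

# Verbalizer: map each class label to a single token the LM could predict
# in the masked slot (a simplified, hypothetical choice of tokens).
VERBALIZER = {"positive": "great", "negative": "terrible"}

def pattern(text):
    """Reformulate an input example as a cloze-style phrase with a mask slot."""
    return f"{text} It was [MASK]."

def toy_mlm_logits(cloze, candidate_tokens):
    """Stand-in for a pretrained masked LM: assign a score to each candidate
    token for the [MASK] slot. Here we just count sentiment cue words, so the
    sketch stays self-contained; PET would use real MLM logits instead."""
    cues = {"great": ["good", "love", "excellent"],
            "terrible": ["bad", "hate", "awful"]}
    text = cloze.lower()
    return {tok: sum(w in text for w in cues[tok]) for tok in candidate_tokens}

def soft_label(text):
    """Softmax over the verbalizer-token scores -> soft label distribution,
    which PET then uses to annotate unlabeled examples."""
    logits = toy_mlm_logits(pattern(text), list(VERBALIZER.values()))
    z = sum(math.exp(v) for v in logits.values())
    return {label: math.exp(logits[tok]) / z for label, tok in VERBALIZER.items()}

dist = soft_label("I love this movie, it is excellent.")
```

The resulting soft-labeled examples would then serve as training data for an ordinary supervised classifier, which is the final step of the procedure.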


