Cutting Down on Prompts and Parameters: Simple Few-Shot Learning with Language Models

06/24/2021
by Robert L. Logan IV, et al.

Prompting language models (LMs) with training examples and task descriptions has been seen as critical to recent successes in few-shot learning. In this work, we show that finetuning LMs in the few-shot setting can considerably reduce the need for prompt engineering. In fact, one can use null prompts, prompts that contain neither task-specific templates nor training examples, and achieve competitive accuracy to manually-tuned prompts across a wide range of tasks. While finetuning LMs does introduce new parameters for each downstream task, we show that this memory overhead can be substantially reduced: finetuning only the bias terms can achieve comparable or better accuracy than standard finetuning while only updating 0.1% of the parameters. All in all, we recommend finetuning LMs for few-shot learning as it is more accurate, robust to different prompts, and can be made nearly as efficient as using frozen LMs.
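To make the two ideas above concrete, the following is a minimal sketch (not the authors' released code) of few-shot finetuning with a null prompt and bias-only updates in the style of BitFit. The model name, verbalizer words, and learning rate are illustrative assumptions.

```python
# Sketch of the abstract's two techniques: a "null prompt" (the input
# followed only by the mask token, with no template or in-context
# examples) and bias-only finetuning. Model, verbalizer words, and
# hyperparameters are assumptions, not the paper's exact setup.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "roberta-base"  # assumption: any masked LM would do
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

def null_prompt(text: str) -> str:
    # No task-specific template, no training examples in the prompt:
    # just the input text followed by the mask token.
    return f"{text} {tokenizer.mask_token}"

# Bias-only finetuning: freeze everything except bias terms.
for name, param in model.named_parameters():
    param.requires_grad = name.endswith("bias")

trainable = [p for p in model.parameters() if p.requires_grad]
n_total = sum(p.numel() for p in model.parameters())
print(f"trainable: {sum(p.numel() for p in trainable) / n_total:.4%}")  # ~0.1%

optimizer = torch.optim.AdamW(trainable, lr=1e-4)

# Verbalizer mapping class labels to single vocabulary tokens
# (an illustrative choice for binary sentiment classification).
label_ids = [tokenizer.encode(w, add_special_tokens=False)[0]
             for w in (" terrible", " great")]

def train_step(text: str, label: int) -> float:
    enc = tokenizer(null_prompt(text), return_tensors="pt")
    logits = model(**enc).logits  # (1, seq_len, vocab_size)
    mask_pos = (enc["input_ids"][0] == tokenizer.mask_token_id).nonzero().item()
    # Classify by comparing the verbalizer tokens' scores at the mask.
    scores = logits[0, mask_pos, label_ids].unsqueeze(0)  # (1, num_labels)
    loss = torch.nn.functional.cross_entropy(scores, torch.tensor([label]))
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

loss = train_step("A gorgeous, witty, seductive movie.", label=1)
```

The printed trainable fraction makes the abstract's 0.1% figure easy to verify: only the bias vectors receive gradient updates, so the per-task storage cost is a small fraction of full finetuning.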


06/03/2021

Reordering Examples Helps during Priming-based Few-Shot Learning

The ability to learn from limited data, or few-shot learning, is a desir...

02/19/2021

Calibrate Before Use: Improving Few-Shot Performance of Language Models

GPT-3 can perform numerous tasks when provided a natural language prompt...

08/05/2022

Few-shot Learning with Retrieval Augmented Language Models

Large language models have shown impressive few-shot results on a wide r...

05/24/2021

True Few-Shot Learning with Language Models

Pretrained language models (LMs) perform well on many tasks even when le...

03/29/2018

MemGEN: Memory is All You Need

We propose a new learning paradigm called Deep Memory. It has the potent...

05/22/2020

One of these (Few) Things is Not Like the Others

To perform well, most deep learning based image classification systems r...

05/20/2022

Prototypical Calibration for Few-shot Learning of Language Models

In-context learning of GPT-like models has been recognized as fragile ac...