Prompting Language Models for Linguistic Structure

11/15/2022
by Terra Blevins, et al.

Although pretrained language models (PLMs) can be prompted to perform a wide range of language tasks, it remains an open question how much this ability comes from generalizable linguistic representations versus more surface-level lexical patterns. To test this, we present a structured prompting approach that can be used to prompt for linguistic structure prediction tasks, allowing us to perform zero- and few-shot sequence tagging with autoregressive PLMs. We evaluate this approach on part-of-speech tagging, named entity recognition, and sentence chunking and demonstrate strong few-shot performance in all cases. We also find that, though the surface forms of the tags provide some signal, structured prompting can retrieve linguistic structure even with arbitrary labels, indicating that PLMs contain this knowledge in a general manner robust to label choice.
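To make the structured prompting idea concrete, below is a minimal sketch of few-shot part-of-speech tagging with an autoregressive PLM via Hugging Face transformers. The word/TAG demonstration format, the example sentences, and the choice of gpt2 are illustrative assumptions, not the authors' exact setup or models; it only shows the general pattern of prompting the model to emit one tag per input word.

```python
# Minimal sketch of few-shot POS tagging via structured prompting with an
# autoregressive LM. Prompt format, demonstrations, and the "gpt2" checkpoint
# are illustrative assumptions, not the paper's exact configuration.

from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Few-shot demonstrations: each word is followed by a slash and its tag,
# prompting the model to produce one tag after each word of the query.
demonstrations = (
    "Sentence: The cat sat .\n"
    "Tags: The/DET cat/NOUN sat/VERB ./PUNCT\n\n"
    "Sentence: She reads books .\n"
    "Tags: She/PRON reads/VERB books/NOUN ./PUNCT\n\n"
)
query = "Sentence: Dogs bark loudly .\nTags:"
prompt = demonstrations + query

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=False,                      # greedy decoding
    pad_token_id=tokenizer.eos_token_id,  # gpt2 has no dedicated pad token
)

# Keep only the newly generated continuation: the predicted word/tag pairs.
generated = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```

In practice the generated tags would be parsed back out of the word/TAG pairs and aligned with the input tokens to score against gold labels; the paper's label-robustness finding suggests the tag strings themselves could even be swapped for arbitrary symbols.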


Related research

Autoregressive Structured Prediction with Language Models (10/26/2022)
Recent years have seen a paradigm shift in NLP towards using pretrained ...

Geographic Adaptation of Pretrained Language Models (03/16/2022)
Geographic linguistic features are commonly used to improve the performa...

Distilling Hypernymy Relations from Language Models: On the Effectiveness of Zero-Shot Taxonomy Induction (02/10/2022)
In this paper, we analyze zero-shot taxonomy learning methods which are ...

TagGPT: Large Language Models are Zero-shot Multimodal Taggers (04/06/2023)
Tags are pivotal in facilitating the effective distribution of multimedi...

Transforming Sequence Tagging Into A Seq2Seq Task (03/16/2022)
Pretrained, large, generative language models (LMs) have had great succe...

Demystifying Prompts in Language Models via Perplexity Estimation (12/08/2022)
Language models can be prompted to perform a wide variety of zero- and f...
