DeepStruct: Pretraining of Language Models for Structure Prediction

05/21/2022
by   Chenguang Wang, et al.
We introduce a method for improving the structural understanding abilities of language models. Unlike previous approaches that finetune models with task-specific augmentation, we pretrain language models on a collection of task-agnostic corpora to generate structures from text. Our structure pretraining enables zero-shot transfer of the knowledge that models learn about structure tasks. We study the performance of this approach on 28 datasets, spanning 10 structure prediction tasks including open information extraction, joint entity and relation extraction, named entity recognition, relation classification, semantic role labeling, event extraction, coreference resolution, factual probing, intent detection, and dialogue state tracking. We further enhance the pretraining with task-specific training sets. We show that a 10B-parameter language model transfers non-trivially to most tasks and obtains state-of-the-art performance on 21 of the 28 datasets we evaluate.
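The core idea of "generating structures from text" can be sketched as casting structure prediction as sequence-to-sequence generation: the model reads a sentence and emits a linearized structure. The triple format and separators below are illustrative assumptions for the sketch, not necessarily the exact serialization used by DeepStruct.

```python
# Minimal sketch of structure prediction as text generation.
# Assumption: structures are (head, relation, tail) triples serialized
# into a flat target string; the real format may differ.

def linearize_triples(triples):
    """Serialize (head, relation, tail) triples into one target string
    that a sequence-to-sequence language model is trained to generate."""
    return " ".join(f"( {h} ; {r} ; {t} )" for h, r, t in triples)

def build_example(sentence, triples):
    """Pair the raw sentence (model input) with its linearized structure
    (model output) to form one pretraining example."""
    return {"input": sentence, "output": linearize_triples(triples)}

example = build_example(
    "Barack Obama was born in Honolulu.",
    [("Barack Obama", "place of birth", "Honolulu")],
)
# example["output"] == "( Barack Obama ; place of birth ; Honolulu )"
```

Because every task (NER, relation extraction, dialogue state tracking, etc.) can be expressed in such a unified input/output text format, a single pretrained model can transfer zero-shot across them.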


