AdaPrompt: Adaptive Prompt-based Finetuning for Relation Extraction

by Xiang Chen, et al.

In this paper, we reformulate the relation extraction task as masked language modeling and propose a novel adaptive prompt-based finetuning approach. We propose an adaptive label word selection mechanism that scatters the relation label into a variable number of label tokens to handle the complex multi-label space. We further introduce an auxiliary entity discriminator objective to encourage the model to focus on context representation learning. Extensive experiments on benchmark datasets demonstrate that our approach achieves better performance in both the few-shot and supervised settings.
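The core idea of the abstract above can be sketched in a few lines: each relation label is verbalized as a variable-length sequence of label tokens, and a prompt with one [MASK] slot per label token is scored by a masked language model. The sketch below is a toy illustration under stated assumptions; the verbalizer entries, the prompt template, and the `toy_token_logprob` scorer are hypothetical stand-ins (a real system would use an MLM's predicted token probabilities), not the authors' implementation.

```python
# Hypothetical verbalizer: each relation maps to a *variable-length*
# sequence of label tokens, as the adaptive selection mechanism allows.
VERBALIZER = {
    "per:employee_of": ["works", "for"],
    "org:founded_by":  ["founded", "by"],
    "no_relation":     ["unrelated"],
}

def build_prompt(sentence, head, tail, n_masks):
    # One [MASK] slot per label token of the candidate relation.
    masks = " ".join(["[MASK]"] * n_masks)
    return f"{sentence} {head} {masks} {tail}."

def toy_token_logprob(token, context):
    # Stand-in for an MLM's per-token log-probability: reward label
    # tokens that literally appear in the context (purely illustrative).
    return 0.0 if token in context.split() else -1.0

def classify(sentence, head, tail):
    # Score each relation by the mean log-prob of its label tokens
    # filling the masked slots; return the best-scoring relation.
    best_label, best_score = None, float("-inf")
    for label, tokens in VERBALIZER.items():
        prompt = build_prompt(sentence, head, tail, len(tokens))
        score = sum(toy_token_logprob(t, prompt) for t in tokens) / len(tokens)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(classify("Steve Jobs founded Apple together with Wozniak.",
               "Apple", "Steve Jobs"))
```

Because "founded" occurs in the context while no token of the other verbalizations does, the toy scorer favors `org:founded_by`; in the actual approach, the MLM head supplies these token probabilities instead of a string match.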


RESIDE: Improving Distantly-Supervised Neural Relation Extraction using Side Information

Distantly-supervised Relation Extraction (RE) methods train an extractor...

Adaptive Prototypical Networks with Label Words and Joint Representation Learning for Few-Shot Relation Classification

Relation classification (RC) task is one of fundamental tasks of informa...

RelationPrompt: Leveraging Prompts to Generate Synthetic Data for Zero-Shot Relation Triplet Extraction

Despite the importance of relation extraction in building and representi...

Heterogeneous Supervision for Relation Extraction: A Representation Learning Approach

Relation extraction is a fundamental task in information extraction. Mos...

Good Visual Guidance Makes A Better Extractor: Hierarchical Visual Prefix for Multimodal Entity and Relation Extraction

Multimodal named entity recognition and relation extraction (MNER and MR...

Speaker-Oriented Latent Structures for Dialogue-Based Relation Extraction

Dialogue-based relation extraction (DiaRE) aims to detect the structural...

Exploring Task Difficulty for Few-Shot Relation Extraction

Few-shot relation extraction (FSRE) focuses on recognizing novel relatio...