LabelPrompt: Effective Prompt-based Learning for Relation Classification

02/16/2023
by   Wenjie Zhang, et al.
Recently, prompt-based learning has become a popular solution for many Natural Language Processing (NLP) tasks: a template is inserted into the model input, converting the task into a cloze-style one and smoothing out the differences between the Pre-trained Language Model (PLM) and the downstream task. In relation classification, however, it is difficult to map the masked output to the relation labels because of their rich semantic information, e.g. "org:founded_by". As a result, a pre-trained model still needs a substantial amount of labelled data to fit the relations. To mitigate this challenge, we present a novel prompt-based learning method for the relation classification task, namely LabelPrompt. It is an intuitive approach driven by a simple motivation: "GIVE MODEL CHOICES!". First, we define additional tokens to represent the relation labels, regard these tokens as a verbalizer with semantic initialisation, and incorporate them into a prompt template. Then, to address the inconsistency between the predicted relation and the given entities, we design an entity-aware module based on contrastive learning. Finally, we apply an attention query strategy in the self-attention layers to distinguish between two types of tokens: prompt tokens and sequence tokens. The proposed strategy effectively improves the adaptation capability of prompt-based learning for relation classification when only a small amount of labelled data is available. Extensive experimental results on several benchmark datasets demonstrate the superiority of the proposed LabelPrompt method, particularly in the few-shot scenario.
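To make the "give model choices" idea concrete, the following is a minimal, hypothetical sketch of how one might construct such a cloze-style prompt: one dedicated vocabulary token per relation label, inserted alongside a [MASK] slot. The names (`RELATIONS`, `label_token`, `build_prompt`) and the exact template wording are illustrative assumptions, not the authors' actual implementation.

```python
# A few example relation labels, as found in datasets such as TACRED.
RELATIONS = ["org:founded_by", "per:employee_of", "no_relation"]

def label_token(relation: str) -> str:
    # One dedicated vocabulary token per relation label; in practice these
    # new tokens would be added to the PLM tokenizer and their embeddings
    # semantically initialised from the label words.
    return f"[REL_{relation.replace(':', '_')}]"

def build_prompt(sentence: str, head: str, tail: str) -> str:
    # Cloze-style template: the PLM is asked to fill [MASK] with one of
    # the label tokens, so it chooses among given options instead of
    # mapping the mask to free-form relation names.
    choices = " ".join(label_token(r) for r in RELATIONS)
    return (
        f"{sentence} The relation between {head} and {tail} "
        f"is [MASK] . Choices: {choices}"
    )

prompt = build_prompt(
    "Steve Jobs co-founded Apple in 1976.", "Apple", "Steve Jobs"
)
print(prompt)
```

At inference time, the masked-token prediction would be restricted to the label tokens, turning relation classification into a constrained fill-in-the-blank task.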

