Schema-aware Reference as Prompt Improves Data-Efficient Relational Triple and Event Extraction

10/19/2022
by   Yunzhi Yao, et al.

Information Extraction, which aims to extract structured relational triples or events from unstructured text, often suffers from data scarcity. With the development of pre-trained language models, many prompt-based approaches to data-efficient information extraction have been proposed and have achieved impressive performance. However, existing prompt learning methods for information extraction remain susceptible to several limitations: (i) a semantic gap between natural language and the structured output knowledge defined by a pre-defined schema; (ii) representation learning over locally individual instances, which limits performance when features are insufficient. In this paper, we propose schema-aware Reference As Prompt (RAP), a novel approach that dynamically leverages schema and knowledge inherited from the global (few-shot) training data for each sample. Specifically, we build a schema-aware reference store that unifies symbolic schema with relevant textual instances. We then employ a dynamic reference integration module to retrieve pertinent knowledge from the datastore as prompts during training and inference. Experimental results demonstrate that RAP can be plugged into various existing models and outperforms baselines in low-resource settings on four datasets for relational triple extraction and event extraction. In addition, we provide comprehensive empirical ablations and case analyses of different types and scales of knowledge to better understand the mechanisms of RAP. Code is available at https://github.com/zjunlp/RAP.
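The core idea, a reference store that pairs symbolic schema labels with textual instances and is queried at inference time to build a prompt, can be illustrated with a minimal sketch. This is a hypothetical toy (the `ReferenceStore` class, the bag-of-words `embed` function, and the prompt format are all illustrative assumptions, not the authors' implementation, which uses learned dense representations from a pre-trained language model):

```python
# Toy sketch of "reference as prompt": retrieve the most similar
# (schema label, instance) pairs from a datastore and prepend them
# to the input. Names and similarity measure are illustrative only.
import math
from collections import Counter

def embed(text):
    # Bag-of-words vector; RAP would use a PLM encoder instead.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ReferenceStore:
    """Unifies symbolic schema (relation labels) with textual instances."""
    def __init__(self):
        self.entries = []  # (embedding, schema_label, instance_text)

    def add(self, schema_label, instance_text):
        self.entries.append((embed(instance_text), schema_label, instance_text))

    def retrieve(self, query, k=2):
        q = embed(query)
        ranked = sorted(self.entries, key=lambda e: cosine(q, e[0]), reverse=True)
        return ranked[:k]

def build_prompt(store, query, k=2):
    # Retrieved (schema, instance) pairs become the prompt prefix.
    refs = store.retrieve(query, k)
    prefix = " ".join(f"[{label}] {text}" for _, label, text in refs)
    return f"{prefix} [SEP] {query}"

store = ReferenceStore()
store.add("born_in", "Marie Curie was born in Warsaw.")
store.add("works_for", "Alice works for Acme Corp.")
prompt = build_prompt(store, "Einstein was born in Ulm.", k=1)
print(prompt)
# → [born_in] Marie Curie was born in Warsaw. [SEP] Einstein was born in Ulm.
```

The design point this sketch captures is that retrieval is dynamic per sample: each input pulls in the globally most relevant schema-grounded references, rather than conditioning every example on the same static prompt.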
