CLEVE: Contrastive Pre-training for Event Extraction

05/30/2021
by Ziqi Wang, et al.

Event extraction (EE) has benefited considerably from fine-tuning pre-trained language models (PLMs). However, existing pre-training methods do not model event characteristics, so the resulting EE models cannot take full advantage of large-scale unsupervised data. To this end, we propose CLEVE, a contrastive pre-training framework for EE that better learns event knowledge from large unsupervised corpora and their semantic structures (e.g., AMR) obtained with automatic parsers. CLEVE contains a text encoder to learn event semantics and a graph encoder to learn event structures. Specifically, the text encoder learns event semantic representations via self-supervised contrastive learning, representing words of the same event closer to each other than to unrelated words; the graph encoder learns event structure representations via graph contrastive pre-training on parsed event-related semantic structures. The two complementary representations then work together to improve both conventional supervised EE and unsupervised "liberal" EE, which requires jointly extracting events and discovering event schemata without any annotated data. Experiments on the ACE 2005 and MAVEN datasets show that CLEVE achieves significant improvements, especially in the challenging unsupervised setting. The source code and pre-trained checkpoints are available at https://github.com/THU-KEG/CLEVE.
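To make the contrastive objective concrete, below is a minimal PyTorch sketch of an InfoNCE-style loss of the kind the text encoder's pre-training implies: a word pair drawn from the same event (e.g., a trigger and one of its arguments) is pulled together, while unrelated words serve as negatives. The function name `event_contrastive_loss`, the temperature value, and the tensor shapes are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F


def event_contrastive_loss(anchor: torch.Tensor,
                           positive: torch.Tensor,
                           negatives: torch.Tensor,
                           temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE-style contrastive loss (illustrative sketch).

    anchor:    (d,)   representation of, e.g., an event trigger word
    positive:  (d,)   representation of a word from the same event
    negatives: (k, d) representations of k unrelated words
    """
    # Cosine similarity via L2-normalized dot products.
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = (anchor * positive).sum(-1, keepdim=True) / temperature  # (1,)
    neg_sim = (negatives @ anchor) / temperature                       # (k,)

    # The positive pair sits at index 0; cross-entropy pushes its
    # similarity above every negative's.
    logits = torch.cat([pos_sim, neg_sim]).unsqueeze(0)                # (1, k+1)
    target = torch.zeros(1, dtype=torch.long)
    return F.cross_entropy(logits, target)


# Toy usage with random 768-dimensional encoder outputs.
d, k = 768, 16
loss = event_contrastive_loss(torch.randn(d), torch.randn(d), torch.randn(k, d))
print(loss.item())
```

In a setting like CLEVE's, the positive pairs would presumably come from words linked in the parsed semantic structure (e.g., a trigger and its argument in an AMR graph), with negatives sampled from words outside that structure; here, any unrelated word representations can be passed in.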
