Event Presence Prediction Helps Trigger Detection Across Languages

09/15/2020
by Parul Awasthy, et al.

The task of event detection and classification is central to most information retrieval applications. We show that a Transformer-based architecture can effectively model event extraction as a sequence labeling task. We propose a combination of sentence-level and token-level training objectives that significantly boosts the performance of a BERT-based event extraction model. Our approach achieves new state-of-the-art performance on ACE 2005 data for English and Chinese. We also test our model on ERE Spanish, achieving an average gain of 2 absolute F1 points over the prior best-performing model.
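The abstract describes combining a sentence-level objective (does this sentence contain an event?) with the usual token-level trigger-labeling objective. A minimal sketch of such a joint loss is shown below, in pure Python for clarity. The function names, the BIO-style tag indices, and the `alpha` weighting factor are all illustrative assumptions, not details from the paper; a real implementation would compute these losses over batched Transformer logits.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def token_level_loss(token_logits, gold_tags):
    """Average cross-entropy over per-token trigger-label logits.

    token_logits: list of per-token score lists (one score per tag class,
                  e.g. BIO-style trigger tags); gold_tags: gold class indices.
    """
    total = 0.0
    for logits, gold in zip(token_logits, gold_tags):
        probs = softmax(logits)
        total += -math.log(probs[gold])
    return total / len(gold_tags)

def sentence_level_loss(sent_logit, has_event):
    """Binary cross-entropy for a sentence-level event-presence prediction."""
    p = 1.0 / (1.0 + math.exp(-sent_logit))
    return -math.log(p) if has_event else -math.log(1.0 - p)

def joint_loss(token_logits, gold_tags, sent_logit, has_event, alpha=0.5):
    # alpha is a hypothetical weight balancing the two objectives.
    return (token_level_loss(token_logits, gold_tags)
            + alpha * sentence_level_loss(sent_logit, has_event))
```

For example, a model that is confident and correct at both levels (`joint_loss([[5.0, 0.0, 0.0]], [0], 5.0, True)`) incurs a much smaller loss than one with uniform, uninformative scores (`joint_loss([[0.0, 0.0, 0.0]], [0], 0.0, True)`).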


