Language Model Priming for Cross-Lingual Event Extraction

09/25/2021
by Steven Fincke et al.

We present a novel, language-agnostic approach to "priming" language models for the task of event extraction, providing particularly effective performance in low-resource and zero-shot cross-lingual settings. With priming, we augment the input to the transformer stack's language model differently depending on the question(s) being asked of the model at runtime. For instance, if the model is being asked to identify arguments for the trigger "protested", we will provide that trigger as part of the input to the language model, allowing it to produce different representations for candidate arguments than when it is asked about arguments for the trigger "arrest" elsewhere in the same sentence. We show that by enabling the language model to better compensate for the deficits of sparse and noisy training data, our approach improves both trigger and argument detection and classification significantly over the state of the art in a zero-shot cross-lingual setting.
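
To make the priming idea concrete, below is a minimal sketch of how a trigger could be supplied as an extra input segment to a multilingual encoder so that candidate-argument representations are conditioned on the specific trigger being asked about. The choice of XLM-RoBERTa, the text-pair encoding, and the helper function are illustrative assumptions, not the paper's exact implementation.

```python
# Illustrative sketch of trigger "priming" for argument extraction.
# Assumes a HuggingFace XLM-R encoder; the paper's actual priming format may differ.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
encoder = AutoModel.from_pretrained("xlm-roberta-base")

def primed_token_embeddings(sentence: str, trigger: str) -> torch.Tensor:
    """Encode the sentence together with the trigger in question, so each
    token's contextual representation depends on that specific trigger."""
    # The trigger is passed as a second segment (text pair), roughly:
    # <s> sentence tokens </s></s> trigger </s>
    inputs = tokenizer(sentence, trigger, return_tensors="pt")
    with torch.no_grad():
        outputs = encoder(**inputs)
    return outputs.last_hidden_state  # one contextual vector per subword

sentence = "Police arrested dozens after crowds protested downtown."
# Same sentence, two different questions -> two different sets of representations
# for the same candidate arguments.
reps_for_protested = primed_token_embeddings(sentence, "protested")
reps_for_arrested = primed_token_embeddings(sentence, "arrested")
```

Because the trigger appears in the encoder's input rather than only in a downstream classifier, the same candidate span (e.g., "Police") can receive different representations depending on whether the question concerns "protested" or "arrested", which is the effect the abstract describes.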

