Verb Knowledge Injection for Multilingual Event Processing

12/31/2020
by Olga Majewska, et al.

In parallel to their overwhelming success across NLP tasks, the language ability of deep Transformer networks pretrained via language modeling (LM) objectives has undergone extensive scrutiny. While probing has revealed that these models encode a range of syntactic and semantic properties of a language, they are still prone to falling back on superficial cues and simple heuristics to solve downstream tasks, rather than leveraging deeper linguistic knowledge. In this paper, we target one such area of deficiency: verbal reasoning. We investigate whether injecting explicit information about verbs' semantic-syntactic behaviour improves the performance of LM-pretrained Transformers on event extraction tasks, downstream tasks for which accurate verb processing is paramount. Concretely, we impart verb knowledge from curated lexical resources into dedicated adapter modules (dubbed verb adapters), allowing it to complement, in downstream tasks, the language knowledge obtained during LM pretraining. We first demonstrate that injecting verb knowledge leads to performance gains in English event extraction. We then explore the utility of verb adapters for event extraction in other languages: we investigate (1) zero-shot language transfer with multilingual Transformers as well as (2) transfer via (noisy, automatic) translation of English verb-based lexical constraints. Our results show that the benefits of verb knowledge injection indeed extend to other languages, even when verb adapters are trained on noisily translated constraints.
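The abstract does not spell out the adapter architecture, but verb adapters follow the general bottleneck-adapter recipe: a small trainable module inserted into each Transformer layer while the pretrained weights stay frozen. The sketch below is a minimal, illustrative PyTorch version of such a module; the class name, hidden size (768), and bottleneck dimension (48) are placeholder assumptions, not the authors' exact implementation.

```python
# Minimal bottleneck-adapter sketch: down-projection, non-linearity,
# up-projection, and a residual connection, as in standard adapter layers.
import torch
import torch.nn as nn

class VerbAdapter(nn.Module):
    def __init__(self, hidden_size: int = 768, bottleneck: int = 48):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)  # down-projection
        self.up = nn.Linear(bottleneck, hidden_size)    # up-projection
        self.act = nn.GELU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual connection preserves the pretrained representations;
        # only the small adapter weights are trained on the verb-knowledge
        # constraints while the Transformer itself remains frozen.
        return hidden_states + self.up(self.act(self.down(hidden_states)))

# Usage sketch: apply the adapter to a layer's hidden states.
adapter = VerbAdapter()
x = torch.randn(2, 16, 768)   # (batch, sequence length, hidden size)
print(adapter(x).shape)       # torch.Size([2, 16, 768])
```

Keeping the injected knowledge in a separate module like this is what allows it to be combined with, or swapped out of, the base model at downstream-task time.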

