Performance Evaluation of Multi-Representation in Deep Learning Models for the Relation Extraction Task

Using a single representation, or concatenating, adding, or replacing representations, has yielded significant improvements on many NLP tasks. This is especially true in relation extraction, where static, contextualized, and other representations can capture word meanings through the linguistic features they incorporate. This work addresses the question of how relation extraction improves when different types of representations generated by pretrained language representation models are used. We benchmark our approach using popular word representation models, replacing and concatenating static representations, contextualized representations, and representations of hand-extracted features. The experiments show that the choice of representation is crucial when a deep learning approach is applied. Word embeddings from Flair and BERT are well exploited by a deep learning model for the relation extraction task, and replacing static word embeddings with contextualized word representations can lead to significant improvements. In contrast, hand-crafted representations are time-consuming to produce and do not guarantee an improvement when combined with other representations.
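The replacement and concatenation strategies compared above can be sketched at the vector level. The following is a minimal illustration, assuming a 300-dimensional static embedding (GloVe-sized) and a 768-dimensional contextualized embedding (BERT-base-sized); the vectors here are random placeholders standing in for real model outputs, not embeddings produced by any actual pretrained model.

```python
import numpy as np

# Hypothetical per-token vectors standing in for real model outputs.
rng = np.random.default_rng(0)
static_vec = rng.standard_normal(300)      # static word embedding (e.g. GloVe-sized)
contextual_vec = rng.standard_normal(768)  # contextualized embedding (e.g. BERT-base-sized)

# Replacing: the downstream model receives only the contextual vector,
# so its input dimensionality is that of the contextualized model.
replaced = contextual_vec

# Concatenating: both views of the token are kept side by side,
# so the downstream layer sees a 300 + 768 = 1068-dimensional input.
concatenated = np.concatenate([static_vec, contextual_vec])

print(replaced.shape)      # (768,)
print(concatenated.shape)  # (1068,)
```

Concatenation preserves information from both representations at the cost of a larger input layer, while replacement keeps the model size fixed but discards the static view entirely.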


