How does a Pre-Trained Transformer Integrate Contextual Keywords? Application to Humanitarian Computing

11/07/2021
by Barriere Valentin, et al.

In a classification task, dealing with text snippets and metadata usually requires a multimodal approach. When the metadata are textual, it is tempting to feed them directly to a pre-trained transformer in order to leverage the semantic information encoded inside the model. This paper describes how to improve a humanitarian classification task by adding the crisis event type to each tweet to be classified. Based on additional experiments on the model weights and behavior, it identifies how the proposed neural network approach partially over-fits the particularities of the Crisis Benchmark, while highlighting that the model is still undoubtedly learning to use and take advantage of the metadata's textual semantics.
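As a rough illustration of the idea (not the paper's exact pipeline), the sketch below encodes the crisis event type as a plain-text keyword paired with the tweet, so a pre-trained transformer consumes the metadata as ordinary text. The backbone model name, label set, and sentence-pair encoding are assumptions made for illustration; a real setup would fine-tune the classifier on the Crisis Benchmark.

    # Minimal sketch, assuming a HuggingFace backbone and an illustrative label set.
    # The event type is passed as the first segment of a sentence pair, so the
    # transformer reads the metadata as ordinary text rather than a separate modality.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    MODEL_NAME = "bert-base-uncased"  # assumed backbone, not specified in the abstract
    LABELS = ["not_humanitarian", "rescue_volunteering", "infrastructure_damage"]  # illustrative only

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=len(LABELS))

    def classify(tweet: str, event_type: str) -> str:
        # Encode "<event type> [SEP] <tweet>" as a sentence pair; the crisis keyword
        # ("earthquake", "flood", ...) is consumed through the same text embeddings.
        inputs = tokenizer(event_type, tweet, truncation=True, return_tensors="pt")
        with torch.no_grad():
            logits = model(**inputs).logits
        return LABELS[int(logits.argmax(dim=-1))]

    print(classify("Roads are blocked and the bridge has collapsed.", "earthquake"))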
