CrisisBERT: a Robust Transformer for Crisis Classification and Contextual Crisis Embedding

05/11/2020
by Junhua Liu, et al.

Classification of crisis events, such as natural disasters, terrorist attacks and pandemics, is a crucial task for creating early signals and informing relevant parties to take prompt action and reduce overall damage. Although crises such as natural disasters can be predicted by professional institutions, certain events are first signaled by civilians, as with the recent COVID-19 pandemic. Social media platforms such as Twitter often expose firsthand signals of such crises through high-volume information exchange, with over half a billion tweets posted daily. Prior works proposed various crisis embeddings and classifiers using conventional machine learning and neural network models. However, none of them performs crisis embedding and classification with state-of-the-art attention-based deep neural networks, such as Transformers and document-level contextual embeddings. This work proposes CrisisBERT, an end-to-end transformer-based model for two crisis classification tasks, namely crisis detection and crisis recognition, which shows promising results in both accuracy and F1 score. The proposed model also demonstrates superior robustness over benchmarks, showing only a marginal performance compromise when extending from 6 to 36 events with only 51.4% additional parameters. We also propose Crisis2Vec, an attention-based, document-level contextual embedding architecture for crisis embedding, which achieves better performance than conventional crisis embedding methods such as Word2Vec and GloVe. To the best of our knowledge, ours is the first work in the literature to propose transformer-based crisis classification and document-level contextual crisis embedding.
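The core idea behind a document-level contextual embedding like Crisis2Vec is that each token's vector is recomputed in the context of all other tokens via self-attention before being pooled into a single document vector. The sketch below is not the paper's implementation; it is a minimal numpy illustration of scaled dot-product self-attention followed by mean pooling, with identity Q/K/V projections assumed for brevity (a real Transformer learns separate projection matrices per attention head).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over token vectors X of shape (n_tokens, d).

    Identity projections stand in for the learned Q, K, V matrices of a
    real Transformer; this keeps the sketch minimal.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # (n, n) token-to-token affinities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ X                   # contextualised token vectors

def document_embedding(X):
    # Pool contextualised tokens into one document-level vector,
    # analogous in spirit to a pooled sentence representation.
    return self_attention(X).mean(axis=0)

rng = np.random.default_rng(0)
tokens = rng.normal(size=(7, 16))        # 7 toy tokens, 16-dim embeddings
doc_vec = document_embedding(tokens)
print(doc_vec.shape)                     # (16,)
```

A classifier head (e.g. a logistic-regression layer for crisis detection, or a softmax over event labels for crisis recognition) would then be trained on top of such document vectors.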
