Deep Learning and Word Embeddings for Tweet Classification for Crisis Response

03/26/2019
by Reem ALRashdi, et al.

Traditional tweet classification models for crisis response rely on convolutional layers and domain-specific word embeddings. In this paper, we study the application of different neural networks with general-purpose and domain-specific word embeddings to investigate their ability to improve the performance of tweet classification models. We evaluate four tweet classification models on the CrisisNLP dataset and obtain comparable results, indicating that general-purpose word embeddings such as GloVe can be used instead of domain-specific word embeddings, especially with Bi-LSTM, which achieved the highest performance of 62.04
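To make the Bi-LSTM setup concrete, the following is a minimal numpy sketch of the architecture family the abstract describes: pretrained word vectors feed a forward and a backward LSTM pass, the two final hidden states are concatenated, and a linear layer produces class probabilities. All dimensions, weights, and the two-class setup (crisis-related vs. not) are illustrative stand-ins, not the authors' actual model or the CrisisNLP data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Single-direction LSTM cell with randomly initialised weights."""
    def __init__(self, input_dim, hidden_dim):
        self.hidden_dim = hidden_dim
        # One stacked weight matrix covering input, forget, cell, output gates.
        self.W = rng.normal(0, 0.1, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_dim
        i = sigmoid(z[0:H])        # input gate
        f = sigmoid(z[H:2 * H])    # forget gate
        g = np.tanh(z[2 * H:3 * H])  # candidate cell state
        o = sigmoid(z[3 * H:4 * H])  # output gate
        c = f * c + i * g
        h = o * np.tanh(c)
        return h, c

def bilstm_encode(embeddings, fwd, bwd):
    """Run the token sequence forwards and backwards; concatenate final states."""
    h_f = c_f = np.zeros(fwd.hidden_dim)
    for x in embeddings:
        h_f, c_f = fwd.step(x, h_f, c_f)
    h_b = c_b = np.zeros(bwd.hidden_dim)
    for x in embeddings[::-1]:
        h_b, c_b = bwd.step(x, h_b, c_b)
    return np.concatenate([h_f, h_b])

# Toy setup: 50-d "GloVe-like" vectors for a 5-token tweet, 2 classes.
# Random vectors stand in for real pretrained embeddings here.
EMB_DIM, HIDDEN, N_CLASSES, SEQ_LEN = 50, 32, 2, 5
tweet_vectors = rng.normal(0, 1, (SEQ_LEN, EMB_DIM))

fwd, bwd = LSTMCell(EMB_DIM, HIDDEN), LSTMCell(EMB_DIM, HIDDEN)
W_out = rng.normal(0, 0.1, (N_CLASSES, 2 * HIDDEN))

features = bilstm_encode(tweet_vectors, fwd, bwd)  # shape (2 * HIDDEN,)
logits = W_out @ features
probs = np.exp(logits - logits.max())
probs /= probs.sum()  # softmax over the two classes; sums to 1
```

In a trained model the weights would be learned and the embedding matrix initialised from GloVe (general-purpose) or a crisis-corpus embedding (domain-specific); swapping the embedding source while holding the encoder fixed is exactly the comparison the paper makes.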


Related research

- Domain-Specific Word Embeddings with Structure Prediction (10/06/2022)
- Lifelong Domain Word Embedding via Meta-Learning (05/25/2018)
- Evaluation of Word Embeddings for the Social Sciences (02/13/2023)
- Simple Modifications to Improve Tabular Neural Networks (08/06/2021)
- A Probabilistic Framework for Learning Domain Specific Hierarchical Word Embeddings (10/16/2019)
- Shallow Domain Adaptive Embeddings for Sentiment Analysis (08/16/2019)
- Vocab-Expander: A System for Creating Domain-Specific Vocabularies Based on Word Embeddings, https://vocab-expander.com (08/07/2023)
