Attention-Based LSTM for Psychological Stress Detection from Spoken Language Using Distant Supervision

05/31/2018
by   Genta Indra Winata, et al.

We propose a Long Short-Term Memory (LSTM) model with an attention mechanism to classify psychological stress from self-conducted interview transcriptions. We apply distant supervision by automatically labeling tweets based on their hashtag content, which complements and expands the size of our corpus. This additional data is used to initialize the model parameters, after which the model is fine-tuned on the interview data. This improves the model's robustness, especially by expanding the vocabulary size. The bidirectional LSTM model with attention is found to be the best model in terms of accuracy (74.1%) and f-score (74.3%). The pre-training step enhances the model's performance by 1.6% accuracy, and the attention mechanism helps the model to select informative words.
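The core architecture described above, a bidirectional LSTM whose per-timestep outputs are pooled by a learned attention layer before classification, can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' implementation; all dimensions, layer sizes, and names (e.g. `AttentiveBiLSTM`) are assumptions for exposition.

```python
import torch
import torch.nn as nn

class AttentiveBiLSTM(nn.Module):
    """Bidirectional LSTM with additive attention over time steps.

    Illustrative sketch of the paper's model family; hyperparameters
    (embedding/hidden sizes) are placeholders, not the reported setup.
    """
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Scores each time step's (forward + backward) hidden state.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        h, _ = self.lstm(self.embedding(token_ids))            # (B, T, 2H)
        weights = torch.softmax(self.attn(h).squeeze(-1), dim=1)  # (B, T)
        # Attention-weighted sum over time steps; the weights indicate
        # which words the classifier attends to.
        context = (weights.unsqueeze(-1) * h).sum(dim=1)       # (B, 2H)
        return self.classifier(context), weights

model = AttentiveBiLSTM(vocab_size=5000)
logits, attn = model(torch.randint(0, 5000, (4, 20)))  # batch of 4 sequences, 20 tokens
```

In the paper's two-stage scheme, such a model would first be trained on the hashtag-labeled tweets (distant supervision) and the resulting weights used to initialize training on the interview transcriptions; the returned attention weights are what allow inspecting which words the model treats as informative.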
