ShEMO: A Large-Scale Validated Database for Persian Speech Emotion Detection

06/04/2019
by Omid Mohamad Nezami, et al.

This paper introduces a large-scale, validated database for Persian called the Sharif Emotional Speech Database (ShEMO). The database includes 3000 semi-natural utterances, equivalent to 3 hours and 25 minutes of speech data extracted from online radio plays. ShEMO covers speech samples of 87 native Persian speakers for five basic emotions (anger, fear, happiness, sadness and surprise) as well as the neutral state. Twelve annotators label the underlying emotional state of the utterances, and majority voting is used to decide on the final labels. According to the kappa measure, the inter-annotator agreement is 64%, which is interpreted as "substantial agreement". We also present benchmark results based on common classification methods for the speech emotion detection task. According to the experiments, a support vector machine achieves the best results for both gender-independent (58.2%) and gender-dependent models (female=59.4%). ShEMO is available at www.shemodb.com for academic purposes, free of charge, to provide a baseline for further research on Persian emotional speech.
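To make the described pipeline concrete, the sketch below shows one way an SVM baseline for speech emotion detection could be set up, including majority voting over annotator labels. It is a minimal illustration in Python: the MFCC features, file-path handling, and data split are assumptions for demonstration, not the feature set or protocol used in the paper.

```python
# Hypothetical SVM baseline sketch for a ShEMO-style setup (assumed features and paths).
import numpy as np
import librosa
from collections import Counter
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

EMOTIONS = ["anger", "fear", "happiness", "sadness", "surprise", "neutral"]

def majority_vote(annotations):
    """Aggregate one utterance's labels from multiple annotators by majority voting."""
    label, _ = Counter(annotations).most_common(1)[0]
    return label

def extract_features(wav_path, sr=16000, n_mfcc=13):
    """Mean MFCC vector per utterance; a simple, common acoustic representation (assumed here)."""
    y, _ = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

def run_baseline(wav_paths, annotator_labels):
    """wav_paths: list of audio files; annotator_labels: one list of labels per utterance."""
    X = np.stack([extract_features(p) for p in wav_paths])
    y = np.array([majority_vote(labels) for labels in annotator_labels])
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
    clf = SVC(kernel="rbf", C=1.0)  # RBF-kernel SVM with default regularization
    clf.fit(X_tr, y_tr)
    return accuracy_score(y_te, clf.predict(X_te))
```

A gender-dependent variant would simply filter `wav_paths` and `annotator_labels` by speaker gender before calling `run_baseline`, training separate models for female and male speakers.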
