K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations

05/08/2020
by Cheul Young Park et al.

With the popularization of low-cost mobile sensors, recognizing emotions during social interactions has many potential applications, but the lack of naturalistic affective interaction data remains a challenge. Most existing emotion datasets were collected in constrained environments and thus do not support studying the idiosyncratic emotions that arise in the wild. Studying emotions in the context of social interactions therefore requires a novel dataset, and K-EmoCon is such a multimodal dataset, with comprehensive annotations of continuous emotions during naturalistic conversations. The dataset contains multimodal measurements, including audiovisual recordings, EEG, and peripheral physiological signals, acquired with off-the-shelf devices from 16 sessions of approximately 10-minute-long paired debates on a social issue. Distinct from previous datasets, it includes emotion annotations from all three available perspectives: self, debate partner, and external observers. While viewing the debate footage, raters annotated emotional displays at 5-second intervals, in terms of arousal-valence and 18 additional categorical emotions. The resulting K-EmoCon is the first publicly available emotion dataset to accommodate multiperspective assessment of emotions during social interactions.
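As an illustration of the annotation scheme described above (5-second segments rated from three perspectives), one might model a single annotation record as follows. This is a sketch only: the field names, value scales, and record layout are assumptions for exposition, not the actual K-EmoCon file format.

```python
from dataclasses import dataclass

@dataclass
class AnnotationSegment:
    """Hypothetical record for one 5-second annotation interval.

    Field names and scales are illustrative assumptions,
    not taken from the actual K-EmoCon release.
    """
    session_id: int    # one of the 16 paired-debate sessions
    start_s: int       # segment start time in seconds (multiple of 5)
    perspective: str   # "self", "partner", or "observer"
    arousal: int       # rating on some ordinal scale (assumed)
    valence: int       # rating on some ordinal scale (assumed)

def segments_per_debate(duration_s: int, interval_s: int = 5) -> int:
    """Number of annotation intervals covering one debate."""
    return duration_s // interval_s

# An approximately 10-minute (600 s) debate, annotated every 5 seconds,
# yields about 120 segments per rater perspective.
print(segments_per_debate(600))  # → 120
```

With three perspectives (self, partner, external observers), each debate would thus yield roughly 3 × 120 rating rows per external rater under these assumptions.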


