Controlling for Confounders in Multimodal Emotion Classification via Adversarial Learning

08/23/2019
by   Mimansa Jaiswal, et al.

Various psychological factors affect how individuals express emotions. Yet, when we collect data intended for use in building emotion recognition systems, we often design elicitation paradigms that focus solely on producing emotional behavior. Algorithms trained on such data are unlikely to function outside of controlled environments because our emotions naturally change as a function of these other factors. In this work, we study how the multimodal expressions of emotion change when an individual is under varying levels of stress. We hypothesize that stress produces modulations that can hide the true underlying emotions of individuals, and that we can make emotion recognition algorithms more generalizable by controlling for variations in stress. To this end, we use adversarial networks to decorrelate stress modulations from emotion representations. We study how stress alters acoustic and lexical emotional predictions, paying special attention to how modulations due to stress affect the transferability of learned emotion recognition models across domains. Our results show that stress is indeed encoded in trained emotion classifiers and that this encoding varies across levels of emotion and across the lexical and acoustic modalities. Our results also show that emotion recognition models that control for stress during training generalize better to new domains than models that do not. We conclude that it is necessary to consider the effect of extraneous psychological factors when building and testing emotion recognition models.


