A Novel Loss Function Utilizing Wasserstein Distance to Reduce Subject-Dependent Noise for Generalizable Models in Affective Computing

08/17/2023
by   Nibraas Khan, et al.

Emotions are an essential part of human behavior that can impact thinking, decision-making, and communication skills. Thus, the ability to accurately monitor and identify emotions can be useful in many human-centered applications such as behavioral training, tracking emotional well-being, and the development of human-computer interfaces. The correlation between patterns in physiological data and affective states has allowed for the utilization of deep learning techniques that can accurately detect the affective states of a person. However, the generalizability of existing models is often limited by subject-dependent noise in the physiological data, caused by variations in a subject's reactions to stimuli. Hence, we propose a novel cost function that employs Optimal Transport Theory, specifically the Wasserstein Distance, to scale the importance of subject-dependent data: higher importance is assigned to patterns in data that are common across all participants, while patterns that result from subject-dependent noise are down-weighted. The performance of the proposed cost function is demonstrated through an autoencoder with a multi-class classifier attached to the latent space, trained simultaneously to detect different affective states. An autoencoder with a standard loss function, i.e., Mean Squared Error, is used as a baseline for comparison with our model across four commonly used datasets. The centroid distance and the minimum distance between different classes are used as metrics to indicate the separation between classes in the latent space. An average increase of 14.75 in the minimum and centroid Euclidean distances was found across all datasets.
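The abstract does not spell out how the Wasserstein term enters the training objective, but the idea it describes (penalizing latent patterns that differ across subjects while keeping a reconstruction loss) can be sketched roughly as below. This is a hypothetical NumPy/SciPy illustration, not the authors' implementation; the function name `subject_weighted_loss`, the per-subject latent dictionary, and the weighting factor `alpha` are assumptions for the sake of the example.

```python
# Sketch (assumed, not the paper's code): combine an MSE reconstruction
# loss with a Wasserstein-distance penalty that grows when a subject's
# latent distribution drifts away from the pooled all-subject distribution.
import numpy as np
from scipy.stats import wasserstein_distance

def subject_weighted_loss(recon, target, latents_by_subject, alpha=0.5):
    """MSE reconstruction loss plus the average 1-D Wasserstein distance
    between each subject's latent samples and the pooled latent samples."""
    mse = np.mean((recon - target) ** 2)
    pooled = np.concatenate(list(latents_by_subject.values()))
    w = np.mean([wasserstein_distance(z, pooled)
                 for z in latents_by_subject.values()])
    return mse + alpha * w
```

Minimizing the penalty term pushes the encoder toward latent features shared across participants, which is the intuition the abstract describes for reducing subject-dependent noise.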
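The two evaluation metrics named in the abstract (centroid distance and minimum distance between classes in the latent space) can be computed as follows. This is a minimal sketch assuming Euclidean distance over 2-D latent arrays; the function name and the choice to report the smallest value over all class pairs are illustrative assumptions.

```python
# Sketch (assumed formulation): class-separation metrics in latent space.
# For every pair of classes, compute (a) the distance between class
# centroids and (b) the smallest point-to-point distance, then report
# the minimum of each over all pairs.
import numpy as np
from itertools import combinations

def class_separation(latents, labels):
    """Return (min centroid distance, min point-to-point distance)
    over all pairs of classes, using Euclidean distance."""
    centroid_d, min_d = np.inf, np.inf
    for a, b in combinations(np.unique(labels), 2):
        za, zb = latents[labels == a], latents[labels == b]
        centroid_d = min(centroid_d, np.linalg.norm(za.mean(0) - zb.mean(0)))
        # all pairwise distances between points of the two classes
        diffs = za[:, None, :] - zb[None, :, :]
        min_d = min(min_d, np.linalg.norm(diffs, axis=-1).min())
    return centroid_d, min_d
```

Larger values of either metric indicate better-separated affective-state clusters in the latent space, which is how the abstract interprets its reported improvement.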

Related research:

- 10/30/2017, "Learning to solve inverse problems using Wasserstein loss": We propose using the Wasserstein loss for training in inverse problems. ...
- 11/16/2021, "Ocean Mover's Distance: Using Optimal Transport for Analyzing Oceanographic Data": Modern ocean datasets are large, multi-dimensional, and inherently spati...
- 10/01/2019, "Training Generative Networks with general Optimal Transport distances": We propose a new algorithm that uses an auxiliary Neural Network to calc...
- 08/26/2023, "Optimal Transport-inspired Deep Learning Framework for Slow-Decaying Problems: Exploiting Sinkhorn Loss and Wasserstein Kernel": Reduced order models (ROMs) are widely used in scientific computing to t...
- 10/15/2018, "Supervised COSMOS Autoencoder: Learning Beyond the Euclidean Loss!": Autoencoders are unsupervised deep learning models used for learning rep...
- 05/05/2021, "Exploring emotional prototypes in a high dimensional TTS latent space": Recent TTS systems are able to generate prosodically varied and realisti...
