Facial Expression and Peripheral Physiology Fusion to Decode Individualized Affective Experience

11/18/2018
by Yu Yin, et al.

In this paper, we present a multimodal approach that simultaneously analyzes facial movements and several peripheral physiological signals to decode individualized affective experiences under positive and negative emotional contexts, while accounting for each person's resting dynamics. We propose a person-specific recurrence network to quantify the dynamics present in the person's facial movements and physiological data. Facial movement is represented using a robust head vs. 3D face landmark localization and tracking approach, and physiological data are processed by extracting known attributes related to the underlying affective experience. The dynamical coupling between the input modalities is then assessed by extracting several complex recurrence network metrics. Inference models are trained on these metrics as features to predict an individual's affective experience in a given context, after the individual's resting dynamics are excluded from their response. We validated our approach on a multimodal dataset consisting of (i) facial videos and (ii) several peripheral physiological signals, synchronously recorded from 12 participants while watching 4 emotion-eliciting video-based stimuli. The affective experience prediction results show that our multimodal fusion method improves prediction accuracy by up to 19% compared to predictions using only one or a subset of the input modalities. Furthermore, we gained an additional prediction improvement by accounting for the effect of individualized resting dynamics.
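The recurrence-network idea at the core of the approach can be sketched minimally: link two time points whenever the signal values at those points lie within a threshold of each other, then summarize the resulting graph with network metrics. The sketch below is an illustrative simplification, not the paper's implementation; the scalar-signal setting, the `eps` threshold, and the choice of recurrence rate as the metric are all assumptions for demonstration.

```python
import numpy as np

def recurrence_network(signal, eps):
    """Build a recurrence-network adjacency matrix from a 1-D signal.

    Two time points are linked when their values lie within eps of
    each other; self-links on the diagonal are removed.
    """
    x = np.asarray(signal, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])   # pairwise distances
    adj = (dist <= eps).astype(int)
    np.fill_diagonal(adj, 0)                 # no self-links
    return adj

def recurrence_rate(adj):
    """Fraction of possible links present (network density)."""
    n = adj.shape[0]
    return adj.sum() / (n * (n - 1))

# Toy example: a slow sinusoid standing in for a physiological signal.
t = np.linspace(0, 2 * np.pi, 50)
adj = recurrence_network(np.sin(t), eps=0.1)
print(adj.shape, recurrence_rate(adj))
```

In the paper's setting, metrics of this kind computed per modality and across modalities serve as features for the inference models, with the person's resting-state dynamics factored out before prediction.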
