A Systematic Review on Affective Computing: Emotion Models, Databases, and Recent Advances

03/14/2022
by Yan Wang et al.

Affective computing plays a key role in human-computer interaction, entertainment, teaching, safe driving, and multimedia integration. Major breakthroughs have recently been made in affective computing (i.e., emotion recognition and sentiment analysis). Affective computing relies on unimodal or multimodal data, consisting primarily of physical information (e.g., textual, audio, and visual data) and physiological signals (e.g., EEG and ECG signals). Physical-based affect recognition attracts more researchers because many public databases are available. However, emotions that a person deliberately conceals are hard to reveal from facial expressions, vocal tones, body gestures, etc. Physiological signals can yield more precise and reliable emotional results; yet the difficulty of acquiring physiological signals also hinders their practical application. Thus, fusing physical information with physiological signals can provide useful features of emotional states and lead to higher accuracy. Instead of focusing on one specific field of affective analysis, we systematically review recent advances in affective computing and taxonomize unimodal affect recognition as well as multimodal affective analysis. First, we introduce two typical emotion models, followed by commonly used databases for affective computing. Next, we survey and taxonomize state-of-the-art unimodal affect recognition and multimodal affective analysis in terms of their detailed architectures and performance. Finally, we discuss several important aspects of affective computing and its applications, and conclude this review by indicating the most promising future directions, such as the establishment of a baseline dataset, fusion strategies for multimodal affective analysis, and unsupervised learning models.

