Deep Affect Prediction in-the-wild: Aff-Wild Database and Challenge, Deep Architectures, and Beyond

04/29/2018
by Dimitrios Kollias, et al.

Automatic understanding of human affect using visual signals is of great importance in everyday human-machine interactions. Appraising human emotional states, behaviors and reactions displayed in real-world settings can be accomplished using latent continuous dimensions (e.g., the circumplex model of affect). Valence (i.e., how positive or negative an emotion is) and arousal (i.e., how strong the activation of the emotion is) constitute the most popular and effective affect representations. Nevertheless, the majority of datasets collected thus far, although containing naturalistic emotional states, have been captured in highly controlled recording conditions. In this paper, we introduce the Aff-Wild benchmark for training and evaluating affect recognition algorithms. We also report on the results of the First Affect-in-the-wild Challenge (Aff-Wild Challenge), which was recently organized on the Aff-Wild database and was the first-ever challenge on the estimation of valence and arousal in-the-wild. Furthermore, we design and extensively train an end-to-end deep neural architecture that predicts continuous emotion dimensions from visual cues. The proposed deep learning architecture, AffWildNet, includes convolutional and recurrent neural network (CNN-RNN) layers, exploiting the invariant properties of convolutional features while modeling the temporal dynamics of human behavior via the recurrent layers. AffWildNet produced state-of-the-art results on the Aff-Wild Challenge. We then exploit the Aff-Wild database to learn features that can serve as priors for both dimensional and categorical emotion recognition, achieving the best performance on the RECOLA, AFEW-VA and EmotiW 2017 datasets compared to all other methods designed for the same goal.
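To make the CNN-RNN idea concrete, the sketch below shows a minimal frame-sequence regressor for valence and arousal: a convolutional feature extractor applied per frame, a GRU over the resulting feature sequence, and a tanh-bounded two-dimensional output. This is an illustrative PyTorch example only; the class name, layer sizes and depth are assumptions and do not reproduce the authors' exact AffWildNet configuration.

```python
# Minimal CNN-RNN sketch for per-frame valence/arousal regression.
# Assumptions (not from the paper): toy CNN depth, GRU hidden size 128,
# 96x96 face crops, outputs bounded to [-1, 1] by tanh.
import torch
import torch.nn as nn

class CnnRnnAffectNet(nn.Module):
    def __init__(self, hidden_size=128):
        super().__init__()
        # Frame-level convolutional feature extractor (kept shallow for brevity).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # -> (batch*time, 64, 1, 1)
        )
        # Recurrent layer models temporal dynamics across consecutive frames.
        self.rnn = nn.GRU(input_size=64, hidden_size=hidden_size, batch_first=True)
        # Two continuous outputs per frame: valence and arousal in [-1, 1].
        self.head = nn.Sequential(nn.Linear(hidden_size, 2), nn.Tanh())

    def forward(self, frames):
        # frames: (batch, time, 3, H, W) sequences of cropped face images.
        b, t, c, h, w = frames.shape
        feats = self.cnn(frames.view(b * t, c, h, w)).view(b, t, -1)
        out, _ = self.rnn(feats)   # (batch, time, hidden)
        return self.head(out)      # (batch, time, 2): [valence, arousal]

model = CnnRnnAffectNet()
clip = torch.randn(2, 16, 3, 96, 96)   # two clips of 16 frames each
print(model(clip).shape)                # torch.Size([2, 16, 2])
```

In practice, such a model would be trained on the per-frame valence/arousal annotations of Aff-Wild, typically with a loss based on the Concordance Correlation Coefficient or mean squared error; the choice of loss here is left open since it is not specified in the abstract.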


research
11/11/2018

Aff-Wild2: Extending the Aff-Wild Database for Affect Recognition

Automatic understanding of human affect using visual signals is a proble...
research
10/24/2019

Emotion recognition with 4k resolution database

Classifying the human emotion through facial expressions is a big topic ...
research
03/24/2022

Continuous-Time Audiovisual Fusion with Recurrence vs. Attention for In-The-Wild Affect Recognition

In this paper, we present our submission to 3rd Affective Behavior Analy...
research
02/09/2020

Two-Stream Aural-Visual Affect Analysis in the Wild

In this work we introduce our submission to the Affective Behavior Analy...
research
08/23/2017

Statistical Selection of CNN-Based Audiovisual Features for Instantaneous Estimation of Human Emotional States

Automatic prediction of continuous-level emotional state requires select...
research
07/16/2019

End-To-End Prediction of Emotion From Heartbeat Data Collected by a Consumer Fitness Tracker

Automatic detection of emotion has the potential to revolutionize mental...
research
09/21/2022

Dynamic Time-Alignment of Dimensional Annotations of Emotion using Recurrent Neural Networks

Most automatic emotion recognition systems exploit time-continuous annot...
