Two-Stream Aural-Visual Affect Analysis in the Wild

02/09/2020
by   Felix Kuhnke, et al.

In this work, we introduce our submission to the Affective Behavior Analysis in-the-wild (ABAW) 2020 competition. We propose a two-stream aural-visual analysis model based on spatial and temporal convolutions. Furthermore, we utilize additional visual features derived from face-alignment knowledge. During training, we exploit correlations between different emotion representations to improve performance. Our model achieves promising results on the challenging Aff-Wild2 database.
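
As an illustration of the overall design, below is a minimal PyTorch-style sketch of a two-stream aural-visual model with multi-task affect heads. It is not the authors' exact architecture: the stream backbones, feature dimensions, fusion scheme, and head sizes (valence/arousal, seven basic expressions, eight action units) are assumptions chosen only to make the example self-contained and runnable.

# Minimal sketch of a two-stream aural-visual affect model (not the authors'
# exact architecture). Assumptions: the visual stream applies separate spatial
# and temporal ("(2+1)D"-style) convolutions to a clip of face crops, the aural
# stream applies 2D convolutions to a log-mel spectrogram, and the fused
# features feed multi-task heads for valence/arousal, expressions, and AUs.
import torch
import torch.nn as nn


class VisualStream(nn.Module):
    """Spatial then temporal convolutions over a clip of face crops."""
    def __init__(self, out_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            # spatial convolution (1 x 7 x 7) followed by temporal convolution (3 x 1 x 1)
            nn.Conv3d(3, 32, kernel_size=(1, 7, 7), stride=(1, 2, 2), padding=(0, 3, 3)),
            nn.BatchNorm3d(32), nn.ReLU(inplace=True),
            nn.Conv3d(32, 32, kernel_size=(3, 1, 1), padding=(1, 0, 0)),
            nn.BatchNorm3d(32), nn.ReLU(inplace=True),
            nn.Conv3d(32, 64, kernel_size=(1, 3, 3), stride=(1, 2, 2), padding=(0, 1, 1)),
            nn.BatchNorm3d(64), nn.ReLU(inplace=True),
            nn.Conv3d(64, 64, kernel_size=(3, 1, 1), padding=(1, 0, 0)),
            nn.BatchNorm3d(64), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),
        )
        self.fc = nn.Linear(64, out_dim)

    def forward(self, clip):  # clip: (B, 3, T, H, W)
        return self.fc(self.net(clip).flatten(1))


class AuralStream(nn.Module):
    """2D convolutions over a log-mel spectrogram of the aligned audio window."""
    def __init__(self, out_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.BatchNorm2d(32), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, out_dim)

    def forward(self, spec):  # spec: (B, 1, n_mels, frames)
        return self.fc(self.net(spec).flatten(1))


class TwoStreamAffectModel(nn.Module):
    """Fuses both streams and predicts all three emotion representations."""
    def __init__(self, feat_dim=256, n_expressions=7, n_action_units=8):
        super().__init__()
        self.visual = VisualStream(feat_dim)
        self.aural = AuralStream(feat_dim)
        fused = 2 * feat_dim
        self.valence_arousal = nn.Linear(fused, 2)             # continuous affect
        self.expression = nn.Linear(fused, n_expressions)      # categorical expression
        self.action_units = nn.Linear(fused, n_action_units)   # multi-label AUs

    def forward(self, clip, spec):
        f = torch.cat([self.visual(clip), self.aural(spec)], dim=1)
        return self.valence_arousal(f), self.expression(f), self.action_units(f)


if __name__ == "__main__":
    model = TwoStreamAffectModel()
    clip = torch.randn(2, 3, 8, 112, 112)    # batch of 8-frame face-crop clips
    spec = torch.randn(2, 1, 64, 100)        # batch of log-mel spectrograms
    va, expr, aus = model(clip, spec)
    print(va.shape, expr.shape, aus.shape)   # (2, 2), (2, 7), (2, 8)

Sharing joint heads over the concatenated features is one simple way to exploit correlations between the emotion representations during training; the paper's actual fusion and loss design may differ.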

Related research:

04/29/2018 · Deep Affect Prediction in-the-wild: Aff-Wild Database and Challenge, Deep Architectures, and Beyond
Automatic understanding of human affect using visual signals is of great...

03/24/2022 · Multi-modal Emotion Estimation for in-the-wild Videos
In this paper, we briefly introduce our submission to the Valence-Arousa...

10/24/2019 · Emotion recognition with 4k resolution database
Classifying the human emotion through facial expressions is a big topic ...

07/08/2021 · Multitask Multi-database Emotion Recognition
In this work, we introduce our submission to the 2nd Affective Behavior ...

02/26/2020 · Multi-Modal Continuous Valence And Arousal Prediction in the Wild Using Deep 3D Features and Sequence Modeling
Continuous affect prediction in the wild is a very interesting problem a...

07/08/2021 · Technical Report for Valence-Arousal Estimation in ABAW2 Challenge
In this work, we describe our method for tackling the valence-arousal es...

02/03/2020 · Adversarial-based neural networks for affect estimations in the wild
There is a growing interest in affective computing research nowadays giv...
