Take an Emotion Walk: Perceiving Emotions from Gaits Using Hierarchical Attention Pooling and Affective Mapping

11/20/2019
by Uttaran Bhattacharya et al.

We present an autoencoder-based semi-supervised approach to classify perceived human emotions from walking styles obtained from videos or from motion-captured data and represented as sequences of 3D poses. Given the motion of each joint in the pose at each time step extracted from 3D pose sequences, we hierarchically pool these joint motions in a bottom-up manner in the encoder, following the kinematic chains in the human body. We also constrain the latent embeddings of the encoder to contain the space of psychologically-motivated affective features underlying the gaits. We train the decoder to reconstruct the motions per joint per time step in a top-down manner from the latent embeddings. For the annotated data, we also train a classifier to map the latent embeddings to emotion labels. Our semi-supervised approach achieves a mean average precision of 0.84 on the Emotion-Gait benchmark dataset, which contains gaits collected from multiple sources. We outperform current state-of-the-art algorithms for both emotion recognition and action recognition from 3D gaits by 7...
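To make the described pipeline concrete, below is a minimal PyTorch sketch of the idea in the abstract: an encoder that pools per-joint motions bottom-up along the kinematic chains, a latent space regressed onto affective features, a decoder that reconstructs the joint motions top-down, and a classifier head trained only on annotated gaits. This is an illustrative assumption, not the authors' implementation: the joint grouping in CHAINS, all layer sizes, the 29-dimensional affective vector, the fixed clip length SEQ_LEN, and the equal loss weighting are placeholders, and the paper's hierarchical attention pooling is replaced here with simple MLP and GRU pooling.

```python
# Minimal sketch (not the paper's exact architecture) of a semi-supervised
# gait-emotion model: hierarchical bottom-up pooling in the encoder, an
# affective-feature constraint on the latent space, top-down reconstruction,
# and a classifier head used only on labelled gaits.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical grouping of 16 joints into five kinematic chains
# (torso/head, two arms, two legs); the indices are placeholders.
CHAINS = {
    "torso": [0, 1, 2, 3],
    "l_arm": [4, 5, 6],
    "r_arm": [7, 8, 9],
    "l_leg": [10, 11, 12],
    "r_leg": [13, 14, 15],
}
NUM_JOINTS, MOTION_DIM, SEQ_LEN = 16, 3, 48   # per-joint 3D motion, fixed-length clips
AFFECTIVE_DIM, NUM_EMOTIONS = 29, 4           # assumed feature and label dimensions


class HierarchicalEncoder(nn.Module):
    """Pools per-joint motions bottom-up: joints -> kinematic chains -> body -> latent."""

    def __init__(self, hidden=32, latent=64):
        super().__init__()
        self.joint_mlp = nn.Sequential(nn.Linear(MOTION_DIM, hidden), nn.ReLU())
        self.chain_mlps = nn.ModuleDict({
            name: nn.Sequential(nn.Linear(len(idx) * hidden, hidden), nn.ReLU())
            for name, idx in CHAINS.items()
        })
        # Temporal pooling over the sequence, then projection to the latent space.
        self.body_gru = nn.GRU(len(CHAINS) * hidden, latent, batch_first=True)

    def forward(self, motions):                         # (batch, time, joints, 3)
        b, t, _, _ = motions.shape
        per_joint = self.joint_mlp(motions)             # (b, t, joints, hidden)
        chains = [self.chain_mlps[n](per_joint[:, :, idx, :].reshape(b, t, -1))
                  for n, idx in CHAINS.items()]
        body = torch.cat(chains, dim=-1)                # (b, t, chains * hidden)
        _, h = self.body_gru(body)
        return h.squeeze(0)                             # (b, latent)


class SemiSupervisedGaitModel(nn.Module):
    """Autoencoder + affective-feature constraint + emotion classifier head."""

    def __init__(self, latent=64):
        super().__init__()
        self.encoder = HierarchicalEncoder(latent=latent)
        # Top-down reconstruction of the motion of every joint at every time step.
        self.decoder = nn.Sequential(
            nn.Linear(latent, 256), nn.ReLU(),
            nn.Linear(256, SEQ_LEN * NUM_JOINTS * MOTION_DIM),
        )
        self.affective_head = nn.Linear(latent, AFFECTIVE_DIM)
        self.classifier = nn.Linear(latent, NUM_EMOTIONS)

    def forward(self, motions):
        z = self.encoder(motions)
        recon = self.decoder(z).view(-1, SEQ_LEN, NUM_JOINTS, MOTION_DIM)
        return z, recon, self.affective_head(z), self.classifier(z)


def semi_supervised_loss(model, motions, affective, labels=None):
    """Reconstruction + affective regression on all gaits; cross-entropy only when labels exist."""
    _, recon, aff_pred, logits = model(motions)
    loss = F.mse_loss(recon, motions) + F.mse_loss(aff_pred, affective)
    if labels is not None:                              # annotated subset only
        loss = loss + F.cross_entropy(logits, labels)
    return loss


if __name__ == "__main__":
    model = SemiSupervisedGaitModel()
    gaits = torch.randn(8, SEQ_LEN, NUM_JOINTS, MOTION_DIM)   # dummy motion sequences
    affect = torch.randn(8, AFFECTIVE_DIM)                    # dummy affective features
    labels = torch.randint(0, NUM_EMOTIONS, (8,))
    print(semi_supervised_loss(model, gaits, affect, labels).item())
```

In this sketch the unlabelled gaits still contribute to training through the reconstruction and affective terms, which is what makes the setup semi-supervised; only the cross-entropy term requires emotion annotations.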

Related research

06/14/2019
Identifying Emotions from Walking using Affective and Deep Features
We present a new data-driven model and algorithm to identify the perceiv...

03/02/2020
ProxEmo: Gait-based Emotion Learning and Multi-view Proxemic Fusion for Socially-Aware Robot Navigation
We present ProxEmo, a novel end-to-end emotion prediction algorithm for ...

10/30/2020
Pose-based Body Language Recognition for Emotion and Psychiatric Symptom Interpretation
Inspired by the human ability to infer emotions from body language, we p...

05/09/2021
Preserving Privacy in Human-Motion Affect Recognition
Human motion is a biomarker used extensively in clinical analysis to mon...

09/18/2020
Learning Unseen Emotions from Gestures via Semantically-Conditioned Zero-Shot Perception with Adversarial Autoencoders
We present a novel generalized zero-shot algorithm to recognize perceive...

10/28/2019
STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits
We present a novel classifier network called STEP, to classify perceived...

10/04/2020
Generating Emotive Gaits for Virtual Agents Using Affect-Based Autoregression
We present a novel autoregression network to generate virtual agents tha...
