Learning Unseen Emotions from Gestures via Semantically-Conditioned Zero-Shot Perception with Adversarial Autoencoders

09/18/2020
by Abhishek Banerjee, et al.

We present a novel generalized zero-shot algorithm to recognize perceived emotions from gestures. Our task is to map gestures to novel emotion categories not encountered in training. We introduce an adversarial autoencoder-based representation learning method that correlates 3D motion-captured gesture sequences with the vectorized representations of natural-language perceived emotion terms, obtained using word2vec embeddings. The language-semantic embedding provides a representation of the emotion label space, and we leverage this underlying distribution to map gesture sequences to the appropriate categorical emotion labels. We train our method using a combination of gestures annotated with known emotion terms and gestures not annotated with any emotions. We evaluate our method on the MPI Emotional Body Expressions Database (EBEDB) and obtain an accuracy of 58.43%, an absolute improvement of 25–27% over current state-of-the-art algorithms for generalized zero-shot learning.
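
To make the pipeline concrete, below is a minimal PyTorch sketch of the kind of semantic alignment the abstract describes: a recurrent encoder maps a gesture sequence into the word2vec label space, a decoder provides the autoencoding branch (so gestures without emotion annotations can still contribute a reconstruction signal), and a discriminator adversarially pushes encoded gestures toward the distribution of label embeddings. All module names, dimensions (300-d word2vec, 100-frame clips, a GRU backbone), and shapes are illustrative assumptions, not the authors' published architecture.

```python
# Illustrative sketch only -- not the authors' exact architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

EMBED_DIM = 300   # word2vec dimensionality (common choice; assumption here)
POSE_DIM = 69     # e.g. 23 joints x 3 coordinates per frame (assumption)
SEQ_LEN = 100     # frames per gesture clip (assumption)

class GestureEncoder(nn.Module):
    """Encodes a gesture sequence into the label-embedding space."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(POSE_DIM, 256, batch_first=True)
        self.proj = nn.Linear(256, EMBED_DIM)

    def forward(self, seq):                 # seq: (B, SEQ_LEN, POSE_DIM)
        _, h = self.rnn(seq)                # h: (1, B, 256)
        return self.proj(h.squeeze(0))      # (B, EMBED_DIM)

class GestureDecoder(nn.Module):
    """Reconstructs the pose sequence from the latent code, so unlabeled
    gestures can still shape the representation (autoencoder branch)."""
    def __init__(self):
        super().__init__()
        self.expand = nn.Linear(EMBED_DIM, 256)
        self.rnn = nn.GRU(256, 256, batch_first=True)
        self.out = nn.Linear(256, POSE_DIM)

    def forward(self, z):                   # z: (B, EMBED_DIM)
        h = self.expand(z).unsqueeze(1).repeat(1, SEQ_LEN, 1)
        y, _ = self.rnn(h)
        return self.out(y)                  # (B, SEQ_LEN, POSE_DIM)

class Discriminator(nn.Module):
    """Adversary that tries to tell encoded gestures apart from true
    word2vec label embeddings, pushing the two distributions together."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(EMBED_DIM, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1))

    def forward(self, z):
        return self.net(z)                  # raw logits for BCEWithLogitsLoss

def zero_shot_predict(encoder, seq, label_embeddings):
    """Assign the emotion label whose word2vec vector is closest (cosine)
    to the encoded gesture; an unseen label just needs a word vector."""
    z = F.normalize(encoder(seq), dim=-1)            # (B, D)
    e = F.normalize(label_embeddings, dim=-1)        # (C, D)
    return (z @ e.t()).argmax(dim=-1)                # (B,)

# Toy usage with random data (shapes only; no real training here).
enc, dec, disc = GestureEncoder(), GestureDecoder(), Discriminator()
seq = torch.randn(4, SEQ_LEN, POSE_DIM)
labels = torch.randn(6, EMBED_DIM)          # stand-in word2vec vectors
recon = dec(enc(seq))                       # reconstruction branch
pred = zero_shot_predict(enc, seq, labels)  # zero-shot inference
```

At inference time, an unseen emotion category needs only its word2vec vector, which is what lets a model of this kind generalize beyond the emotion labels seen in training.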

Related research

10/13/2020 · A Generalized Zero-Shot Framework for Emotion Recognition from Body Gestures
Although automatic emotion recognition from facial expressions and speec...

09/29/2020 · A Prototype-Based Generalized Zero-Shot Learning Framework for Hand Gesture Recognition
Hand gesture recognition plays a significant role in human-computer inte...

01/26/2021 · Text2Gestures: A Transformer-Based Network for Generating Emotive Body Gestures for Virtual Agents
We present Text2Gestures, a transformer-based learning method to interac...

09/14/2022 · Natural Language Inference Prompts for Zero-shot Emotion Classification in Text across Corpora
Within textual emotion classification, the set of relevant labels depend...

09/21/2020 · Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition
Despite the recent achievements made in the multi-modal emotion recognit...

11/20/2019 · Take an Emotion Walk: Perceiving Emotions from Gaits Using Hierarchical Attention Pooling and Affective Mapping
We present an autoencoder-based semi-supervised approach to classify per...

07/01/2021 · iMiGUE: An Identity-free Video Dataset for Micro-Gesture Understanding and Emotion Analysis
We introduce a new dataset for the emotional artificial intelligence res...
