iMiGUE: An Identity-free Video Dataset for Micro-Gesture Understanding and Emotion Analysis

by Xin Liu, et al.

We introduce a new dataset for emotional artificial intelligence research: the identity-free video dataset for Micro-Gesture Understanding and Emotion analysis (iMiGUE). Unlike existing public datasets, iMiGUE focuses on nonverbal body gestures without using any identity information, whereas most prior research on emotion analysis relies on sensitive biometric data such as the face and speech. Most importantly, iMiGUE targets micro-gestures, i.e., unintentional behaviors driven by inner feelings, which differ from the ordinary gestures in other gesture datasets, which are mostly performed intentionally for illustrative purposes. Furthermore, iMiGUE is designed to evaluate a model's ability to analyze emotional states by integrating information from recognized micro-gestures, rather than merely recognizing prototypes in the sequences separately (in isolation); the real need in emotion AI is to understand the emotional states behind gestures in a holistic way. Moreover, to counter the challenge of this dataset's imbalanced sample distribution, an unsupervised learning method is proposed to capture latent representations from the micro-gesture sequences themselves. We systematically investigate representative methods on this dataset, and comprehensive experimental results reveal several interesting insights from iMiGUE, e.g., that micro-gesture-based analysis can promote emotion understanding. We confirm that the new iMiGUE dataset can advance studies of micro-gestures and emotion AI.
