Generating Emotive Gaits for Virtual Agents Using Affect-Based Autoregression

10/04/2020
by   Uttaran Bhattacharya, et al.

We present a novel autoregression network to generate virtual agents that convey various emotions through their walking styles or gaits. Given the 3D pose sequences of a gait, our network extracts pertinent movement features and affective features from the gait. We use these features to synthesize subsequent gaits such that the virtual agents can express and transition between emotions represented as combinations of happy, sad, angry, and neutral. We incorporate multiple regularizations in the training of our network to simultaneously enforce plausible movements and noticeable emotions on the virtual agents. We also integrate our approach with an AR environment using a Microsoft HoloLens and can generate emotive gaits at interactive rates to increase social presence. We evaluate how human observers perceive both the naturalness and the emotions of the generated gaits of the virtual agents in a web-based study. Our results indicate that observers found the naturalness of around 89% of the gaits satisfactory on a five-point Likert scale, and the emotions they perceived from the virtual agents are statistically similar to the intended emotions of the virtual agents. We also use our network to augment existing gait datasets with emotive gaits and will release this augmented dataset for future research in emotion prediction and emotive gait synthesis. Our project website is available at https://gamma.umd.edu/gen_emotive_gaits/.
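The core idea in the abstract is an autoregressive loop: each new pose is predicted from the preceding poses, conditioned on a target emotion expressed as a mixture over happy, sad, angry, and neutral. The sketch below illustrates that loop with a toy linear model standing in for the trained network; the skeleton size, function names, and weights are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Assumed emotion basis from the abstract; the mixture weights describe
# how strongly each basic emotion is expressed.
EMOTIONS = ("happy", "sad", "angry", "neutral")
NUM_JOINTS = 16                # assumed skeleton size (not from the paper)
POSE_DIM = NUM_JOINTS * 3      # 3D coordinates per joint, flattened

rng = np.random.default_rng(0)
# Stand-in "network" weights; a trained model would replace these.
W_pose = rng.standard_normal((POSE_DIM, POSE_DIM)) * 0.01
W_emotion = rng.standard_normal((len(EMOTIONS), POSE_DIM)) * 0.01

def predict_next_pose(prev_pose, emotion_mix):
    """One autoregressive step: next pose from the previous pose
    plus an affect-dependent offset."""
    return prev_pose + prev_pose @ W_pose + emotion_mix @ W_emotion

def generate_gait(seed_pose, emotion_mix, num_frames):
    """Roll the model forward to synthesize a gait sequence."""
    poses = [seed_pose]
    for _ in range(num_frames - 1):
        poses.append(predict_next_pose(poses[-1], emotion_mix))
    return np.stack(poses)

# Example: synthesize 60 frames of a mostly-happy gait.
emotion_mix = np.array([0.7, 0.0, 0.1, 0.2])   # happy/sad/angry/neutral
seed_pose = rng.standard_normal(POSE_DIM)
gait = generate_gait(seed_pose, emotion_mix, num_frames=60)
print(gait.shape)  # (60, 48)
```

Transitioning between emotions, as described in the abstract, amounts to changing `emotion_mix` partway through the rollout while the pose recurrence keeps the motion continuous.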

Related research

- EVA: Generating Emotional Behavior of Virtual Agents using Expressive Features of Gait and Gaze (07/03/2019)
  We present a novel, real-time algorithm, EVA, for generating virtual age...

- FVA: Modeling Perceived Friendliness of Virtual Agents Using Movement Characteristics (06/30/2019)
  We present a new approach for improving the friendliness and warmth of a...

- STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits (10/28/2019)
  We present a novel classifier network called STEP, to classify perceived...

- Text2Gestures: A Transformer-Based Network for Generating Emotive Body Gestures for Virtual Agents (01/26/2021)
  We present Text2Gestures, a transformer-based learning method to interac...

- Take an Emotion Walk: Perceiving Emotions from Gaits Using Hierarchical Attention Pooling and Affective Mapping (11/20/2019)
  We present an autoencoder-based semi-supervised approach to classify per...

- e-Inu: Simulating A Quadruped Robot With Emotional Sentience (01/03/2023)
  Quadruped robots are currently used in industrial robotics as mechanical...
