Quantitative analysis of robot gesticulation behavior

10/22/2020
by Unai Zabala, et al.

Social robot capabilities, such as talking gestures, are best produced using data-driven approaches to avoid repetitive behavior and to convey trustworthiness. However, robust quantitative methods for comparing such approaches beyond visual evaluation are lacking. In this paper, a quantitative analysis is performed to compare two gesture generation approaches based on Generative Adversarial Networks. The aim is to measure fidelity to the original training data while also tracking the degree of originality of the produced gestures. Principal Coordinate Analysis and Procrustes statistics are applied, and a new Fréchet Gesture Distance is proposed by adapting the Fréchet Inception Distance to gestures. Together, these three techniques are used to assess the fidelity and originality of the generated gestures.
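
As a concrete illustration of the proposed metric, the sketch below computes a Fréchet-style distance between feature sets extracted from real and generated gesture sequences, following the standard Fréchet Inception Distance formula. The function name, the choice of feature extractor, and the assumption that gesture features are summarized as (samples x features) matrices are illustrative and not taken from the paper's implementation.

```python
# Minimal sketch of a Frechet-style gesture distance, assuming gesture clips
# have already been mapped to fixed-length feature vectors (one row per clip).
import numpy as np
from scipy import linalg


def frechet_gesture_distance(real_feats: np.ndarray, gen_feats: np.ndarray) -> float:
    """Frechet distance between Gaussians fitted to two (n_samples, n_features) sets."""
    mu_r, mu_g = real_feats.mean(axis=0), gen_feats.mean(axis=0)
    sigma_r = np.cov(real_feats, rowvar=False)
    sigma_g = np.cov(gen_feats, rowvar=False)

    # Squared difference of means.
    diff = mu_r - mu_g

    # Matrix square root of the product of the two covariance matrices.
    covmean, _ = linalg.sqrtm(sigma_r @ sigma_g, disp=False)
    if np.iscomplexobj(covmean):
        covmean = covmean.real  # drop tiny imaginary parts from numerical error

    return float(diff @ diff + np.trace(sigma_r + sigma_g - 2.0 * covmean))
```

A low value indicates that the generated gestures' feature distribution is close to that of the training data (fidelity); originality would still need to be checked separately, for instance with the Procrustes statistics mentioned above.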

research
06/28/2021

Speech2Properties2Gestures: Gesture-Property Prediction as a Tool for Generating Representational Gestures from Speech

We propose a new framework for gesture generation, aiming to allow data-...
research
03/01/2021

GestureMap: Supporting Visual Analytics and Quantitative Analysis of Motion Elicitation Data by Learning 2D Embeddings

This paper presents GestureMap, a visual analytics tool for gesture elic...
research
09/04/2019

Learning to gesticulate by observation using a deep generative approach

The goal of the system presented in this paper is to develop a natural t...
research
03/04/2021

Toward Automated Generation of Affective Gestures from Text: A Theory-Driven Approach

Communication in both human-human and human-robot interaction (HRI) con...
research
04/03/2020

VGPN: Voice-Guided Pointing Robot Navigation for Humans

Pointing gestures are widely used in robot navigation approaches nowadays...
research
05/02/2023

AQ-GT: a Temporally Aligned and Quantized GRU-Transformer for Co-Speech Gesture Synthesis

The generation of realistic and contextually relevant co-speech gestures...
research
09/15/2015

Free-body Gesture Tracking and Augmented Reality Improvisation for Floor and Aerial Dance

This paper describes an updated interactive performance system for floor...
