Understanding the Predictability of Gesture Parameters from Speech and their Perceptual Importance

10/02/2020
by Ylva Ferstl, et al.

Gesture behavior is a natural part of human conversation. Much work has focused on removing the need for tedious hand-animation when creating embodied conversational agents by designing speech-driven gesture generators. However, these generators often work in a black-box manner, assuming a general relationship between input speech and output motion. As their success remains limited, we investigate in more detail how speech may relate to different aspects of gesture motion. We determine a number of parameters characterizing gesture, such as speed and gesture size, and explore their relationship to the speech signal in a twofold manner. First, we train multiple recurrent networks to predict the gesture parameters from speech in order to understand how well gesture attributes can be modeled from speech alone. We find that gesture parameters can be partially predicted from speech, with some parameters, such as path length, predicted more accurately than others, such as velocity. Second, we design a perceptual study to assess the importance of each gesture parameter for producing motion that people perceive as appropriate for the speech. Results show that degrading any parameter was viewed negatively, but some changes, such as altered hand shape, were more impactful than others. A video summarizing the work can be found at https://youtu.be/aw6-_5kmLjY.
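To make the prediction setup concrete, the sketch below shows one plausible way to regress scalar gesture parameters from a speech segment with a recurrent network. It is not the authors' implementation: the class name GestureParamPredictor, the use of a bidirectional GRU in PyTorch, the 40-dimensional per-frame speech features, and the particular set of five output parameters are all illustrative assumptions.

# Minimal sketch (assumed, not the paper's code): a recurrent regressor that maps
# a sequence of per-frame speech features to a few scalar gesture parameters
# such as path length, velocity, and gesture size.
import torch
import torch.nn as nn

class GestureParamPredictor(nn.Module):
    def __init__(self, n_speech_features=40, hidden_size=128, n_gesture_params=5):
        super().__init__()
        # A bidirectional GRU encodes the speech frames covering one gesture's time span.
        self.rnn = nn.GRU(n_speech_features, hidden_size,
                          batch_first=True, bidirectional=True)
        # A linear head regresses the scalar gesture parameters from the summary state.
        self.head = nn.Linear(2 * hidden_size, n_gesture_params)

    def forward(self, speech_frames):
        # speech_frames: (batch, time, n_speech_features)
        _, h_n = self.rnn(speech_frames)               # h_n: (2, batch, hidden_size)
        summary = torch.cat([h_n[0], h_n[1]], dim=-1)  # concatenate both directions
        return self.head(summary)                      # (batch, n_gesture_params)

# Example usage: a batch of 8 speech segments, each 100 frames of 40-dim features.
model = GestureParamPredictor()
speech = torch.randn(8, 100, 40)
predicted = model(speech)  # e.g., [path length, velocity, size, height, hand-shape score]
loss = nn.functional.mse_loss(predicted, torch.randn(8, 5))  # regression objective (assumed)

A simple mean-squared-error regression objective is assumed here; the study's finding that some parameters (e.g., path length) are easier to predict than others (e.g., velocity) would show up as per-parameter differences in such an error.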

