GestureMap: Supporting Visual Analytics and Quantitative Analysis of Motion Elicitation Data by Learning 2D Embeddings

03/01/2021
by Hai Dang, et al.

This paper presents GestureMap, a visual analytics tool for gesture elicitation which directly visualises the space of gestures. Concretely, a Variational Autoencoder embeds gestures recorded as 3D skeletons on an interactive 2D map. GestureMap further integrates three computational capabilities to connect exploration to quantitative measures: Leveraging DTW Barycenter Averaging (DBA), we compute average gestures to 1) represent gesture groups at a glance; 2) compute a new consensus measure (variance around average gesture); and 3) cluster gestures with k-means. We evaluate GestureMap and its concepts with eight experts and an in-depth analysis of published data. Our findings show how GestureMap facilitates exploring large datasets and helps researchers to gain a visual understanding of elicited gesture spaces. It further opens new directions, such as comparing elicitations across studies. We discuss implications for elicitation studies and research, and opportunities to extend our approach to additional tasks in gesture elicitation.
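The three DBA-based capabilities described above map naturally onto existing time-series tooling. The sketch below is not the authors' implementation; it uses the tslearn library (an assumption, the paper page does not name the tooling) to illustrate computing an average gesture with DBA, a variance-around-average consensus value, and k-means clustering under the DTW metric, where tslearn updates centroids with DBA. Array shapes and the exact consensus formula are illustrative assumptions.

```python
# Minimal sketch, assuming gestures are resampled to a fixed length and
# stored as (frames, joints * 3) arrays; not the authors' implementation.
import numpy as np
from tslearn.barycenters import dtw_barycenter_averaging
from tslearn.clustering import TimeSeriesKMeans
from tslearn.metrics import dtw

# Placeholder data: 20 gestures, 50 frames, 20 joints with x/y/z coordinates.
gestures = np.random.rand(20, 50, 60)

# 1) Average gesture of a group via DTW Barycenter Averaging (DBA).
average_gesture = dtw_barycenter_averaging(gestures)

# 2) Consensus as variance around the average gesture -- read here as the
#    variance of DTW distances from each gesture to the DBA barycenter
#    (an illustrative interpretation of the abstract's measure).
distances = np.array([dtw(g, average_gesture) for g in gestures])
consensus = distances.var()

# 3) Cluster gestures with k-means under the DTW metric; tslearn's
#    TimeSeriesKMeans uses DBA to update the cluster centroids.
labels = TimeSeriesKMeans(n_clusters=3, metric="dtw",
                          random_state=0).fit_predict(gestures)
```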

Related research

Quantitative analysis of robot gesticulation behavior (10/22/2020)
Social robot capabilities, such as talking gestures, are best produced u...

Iterative Design of Gestures during Elicitation: A Gateway into User's Mental Models (04/10/2021)
The design of gestural interfaces through gesture elicitation studies is...

GestureLens: Visual Analysis of Gestures in Presentation Videos (04/19/2022)
Appropriate gestures can enhance message delivery and audience engagemen...

Free-body Gesture Tracking and Augmented Reality Improvisation for Floor and Aerial Dance (09/15/2015)
This paper describes an updated interactive performance system for floor...

How Do Users Interact with an Error-Prone In-Air Gesture Recognizer? (05/26/2021)
We present results of two pilot studies that investigated human error be...

Omnis Prædictio: Estimating the Full Spectrum of Human Performance with Stroke Gestures (05/27/2020)
Designing effective, usable, and widely adoptable stroke gesture command...

Labeling the Phrase Set of the Conversation Agent, Rinna (10/13/2020)
Mapping spoken text to gestures is an important research area for robots...
