Omnis Prædictio: Estimating the Full Spectrum of Human Performance with Stroke Gestures

05/27/2020
by   Luis A. Leiva, et al.

Designing effective, usable, and widely adoptable stroke gesture commands for graphical user interfaces is a challenging task that traditionally involves multiple iterative rounds of prototyping, implementation, and follow-up user studies and controlled experiments for evaluation, verification, and validation. An alternative approach is to employ theoretical models of human performance, which can provide practitioners with insightful information from the earliest stages of user interface design. However, very few aspects of the large spectrum of human performance with stroke gesture input have been investigated and modeled so far, leaving researchers and practitioners of gesture-based user interface design with a very narrow range of predictable measures of human performance, mostly focused on estimating production time, and extremely few of these cases delivered accompanying software tools to assist modeling. We address this problem by introducing "Omnis Praedictio" (Omnis for short), a generic technique and companion web tool that provides accurate user-independent estimations of any numerical stroke gesture feature, including custom features specified in code. Our experimental results on three public datasets show that our model estimations correlate on average r > .9 with ground-truth data. Omnis also enables researchers and practitioners to understand human performance with stroke gestures on many levels and, consequently, raises the bar for human performance models and estimation techniques for stroke gesture input.
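To make the notion of a "numerical stroke gesture feature, including custom features specified in code" concrete, here is a minimal sketch of what such a feature might look like: the total path length of a stroke sampled as (x, y) points. The function name `path_length` and the point representation are our own illustrative assumptions, not the paper's actual API.

```python
import math

def path_length(points):
    """Hypothetical custom gesture feature: total Euclidean length of a stroke.

    `points` is a list of (x, y) tuples sampled along the gesture path.
    """
    return sum(
        math.dist(p, q)  # straight-line distance between consecutive samples
        for p, q in zip(points, points[1:])
    )

# Example: a unit square traced as a closed stroke has path length 4.0.
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
print(path_length(square))  # → 4.0
```

A tool like the one described could, in principle, accept any such scalar-valued function of the sampled stroke and estimate its distribution across users.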

