Zoomorphic Gestures for Communicating Cobot States

02/22/2021
by Vanessa Sauer, et al.

Communicating the robot state is vital to creating an efficient and trustworthy collaboration between humans and collaborative robots (cobots). Standard approaches to robot-to-human communication face difficulties in industrial settings, e.g., due to high noise levels or visibility constraints. Therefore, this paper presents zoomorphic gestures based on dog body language as a possible alternative for communicating the state of appearance-constrained cobots. For this purpose, we conduct a visual communication benchmark comparing zoomorphic gestures, abstract gestures, and light displays. We compare the modalities with respect to intuitive understanding, user experience, and user preference. In a first user study (n = 93), we evaluate our proposed design guidelines for all visual modalities. A second user study (n = 214), constituting the benchmark, indicates that intuitive understanding and user experience are highest for both gesture-based modalities. Furthermore, zoomorphic gestures are considerably preferred over the other modalities. These findings indicate that zoomorphic gestures, with their playful nature, are especially suitable for novice users and may decrease initial inhibitions.
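To make the core idea concrete, the sketch below shows one way a state-to-gesture mapping could look in code. This is a minimal illustration, not the authors' implementation: the state names, gesture primitives, and the `CobotState` / `express_state` identifiers are all assumptions for the sake of the example.

```python
from enum import Enum, auto

class CobotState(Enum):
    """Hypothetical cobot states a gesture display might communicate."""
    IDLE = auto()
    PROCESSING = auto()
    SUCCESS = auto()
    ERROR = auto()

# Illustrative mapping from cobot states to dog-inspired gesture
# primitives (assumed names; the paper's actual gesture set may differ).
ZOOMORPHIC_GESTURES = {
    CobotState.IDLE: "slow_tail_wag",       # relaxed, approachable
    CobotState.PROCESSING: "head_tilt",     # attentive, "thinking"
    CobotState.SUCCESS: "fast_tail_wag",    # excited, task complete
    CobotState.ERROR: "ears_down_crouch",   # submissive, asking for help
}

def express_state(state: CobotState) -> str:
    """Return the gesture primitive the cobot should play for a given state."""
    return ZOOMORPHIC_GESTURES[state]

if __name__ == "__main__":
    print(express_state(CobotState.ERROR))  # -> "ears_down_crouch"
```

A lookup table like this keeps the gesture vocabulary small and legible, which matches the paper's motivation: novice users should be able to read the cobot's state at a glance, without training on an abstract signal code.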
