Multimodal Shared Autonomy for Social Navigation Assistance of Telepresence Robots

10/17/2022
by Kenechukwu C. Mbanisi, et al.

Mobile telepresence robots (MTRs) have become increasingly popular in the expanding world of remote work, providing new avenues for people to actively participate in activities at a distance. However, humans operating MTRs often have difficulty navigating in densely populated environments due to limited situation awareness and a narrow field of view, which reduces user acceptance and satisfaction. Shared autonomy in navigation has been studied primarily in static environments or in situations where only one pedestrian interacts with the robot. We present a multimodal shared autonomy approach, leveraging visual and haptic guidance, to provide navigation assistance for remote operators in densely populated environments. It uses a modified form of reciprocal velocity obstacles to generate safe control inputs while taking social proxemics constraints into account. Two different visual guidance designs, as well as haptic force rendering, were proposed to convey the safe control input. We conducted a user study to compare the merits and limitations of multimodal navigation assistance against haptic or visual assistance alone on a shared navigation task. The study involved 15 participants operating a virtual telepresence robot in a virtual hall with moving pedestrians, using the different assistance modalities. We evaluated navigation performance, transparency and cooperation, as well as user preferences. Our results showed that participants preferred multimodal assistance with a visual guidance trajectory over the haptic or visual modalities alone, although it had no impact on navigation performance. Additionally, we found that visual guidance trajectories conveyed a higher degree of understanding and cooperation than equivalent haptic cues in a navigation task.
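The paper does not include source code in this abstract, but the core idea of velocity-obstacle-style safe input generation with a proxemics margin can be sketched as follows. This is a minimal illustration, not the authors' implementation: it samples candidate velocities, inflates each pedestrian's radius by an assumed proxemic margin, and keeps only candidates whose time to collision exceeds a planning horizon, returning the safe candidate closest to the operator's preferred velocity. All parameter names and values (`proxemic_margin`, `horizon`, sample counts) are illustrative assumptions.

```python
import math

def time_to_collision(p_rel, v_rel, radius):
    """Earliest time at which the relative position comes within
    `radius` under constant relative velocity, or inf if never."""
    c = p_rel[0]**2 + p_rel[1]**2 - radius**2
    if c < 0:
        return 0.0  # already inside the inflated radius
    a = v_rel[0]**2 + v_rel[1]**2
    b = 2.0 * (p_rel[0]*v_rel[0] + p_rel[1]*v_rel[1])
    disc = b*b - 4.0*a*c
    if a == 0.0 or disc < 0.0:
        return math.inf  # paths never intersect
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t >= 0.0 else math.inf

def safe_velocity(robot_pos, pedestrians, v_pref, radius=0.5,
                  proxemic_margin=0.45, v_max=1.0, horizon=4.0, samples=64):
    """Pick the sampled velocity closest to the operator's preferred
    velocity `v_pref` whose time to collision with every pedestrian
    (radius inflated by a proxemic margin) exceeds `horizon`.
    `pedestrians` is a list of (px, py, vx, vy) tuples."""
    best, best_cost = (0.0, 0.0), math.inf
    for i in range(samples):
        ang = 2.0 * math.pi * i / samples
        for speed in (0.25 * v_max, 0.5 * v_max, v_max):
            v = (speed * math.cos(ang), speed * math.sin(ang))
            ttc = min((time_to_collision(
                           (px - robot_pos[0], py - robot_pos[1]),
                           (pvx - v[0], pvy - v[1]),
                           radius + proxemic_margin)
                       for (px, py, pvx, pvy) in pedestrians),
                      default=math.inf)
            if ttc <= horizon:
                continue  # candidate violates the safety horizon
            cost = (v[0] - v_pref[0])**2 + (v[1] - v_pref[1])**2
            if cost < best_cost:
                best, best_cost = v, cost
    return best
```

In a shared-autonomy loop, the returned velocity would be the "safe control input" that the visual trajectory or haptic force cue communicates back to the operator, rather than being applied autonomously.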


Related research

12/01/2018 · Conversations for Vision: Remote Sighted Assistants Helping People with Visual Impairments
People with visual impairment (PVI) must interact with a world they cann...

09/09/2019 · Virtual Fixture Assistance for Suturing in Robot-Aided Pediatric Endoscopic Surgery
The limited workspace in pediatric endoscopic surgery makes surgical sut...

03/05/2023 · Vision based Virtual Guidance for Navigation
This paper explores the impact of virtual guidance on mid-level represen...

10/18/2021 · Enabling a Social Robot to Process Social Cues to Detect when to Help a User
It is important for socially assistive robots to be able to recognize wh...

03/03/2023 · Improving Surgical Situational Awareness with Signed Distance Field: A Pilot Study in Virtual Reality
The introduction of image-guided surgical navigation (IGSN) has greatly ...

11/10/2020 · VFH+ based shared control for remotely operated mobile robots
This paper addresses the problem of safe and efficient navigation in rem...

07/21/2021 · Investigating External Interaction Modality and Design Between Automated Vehicles and Pedestrians at Crossings
In this study, we investigated the effectiveness and user acceptance of ...
