Spatial Computing and Intuitive Interaction: Bringing Mixed Reality and Robotics Together

02/03/2022
by Jeffrey Delmerico et al.

Spatial computing – the ability of devices to be aware of their surroundings and to represent this digitally – offers novel capabilities in human-robot interaction. In particular, the combination of spatial computing and egocentric sensing on mixed reality devices enables them to capture and understand human actions and translate these into actions with spatial meaning, opening exciting new possibilities for collaboration between humans and robots. This paper presents several human-robot systems that exploit these capabilities to enable novel robot use cases: mission planning for inspection, gesture-based control, and immersive teleoperation. These works demonstrate the power of mixed reality as a tool for human-robot interaction, and the potential of spatial computing to drive its future.
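Translating an egocentric observation (e.g. a pointing gesture seen by a headset) into a spatially meaningful robot action boils down to expressing the same point in a shared world frame and then in the robot's frame. The sketch below is illustrative only, not code from the paper: the `yaw_T` helper and all pose values are hypothetical, and it assumes both the headset and the robot have been localized in a common world frame.

```python
import math

def yaw_T(yaw, tx, ty, tz):
    """4x4 homogeneous transform: rotation about z by yaw, then translation."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0, tx],
            [s,  c, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

def apply(T, p):
    """Apply a homogeneous transform to a 3D point."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

def invert(T):
    """Invert a rigid-body transform: (R, t) -> (R^T, -R^T t)."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]  # transpose of rotation
    t = [T[i][3] for i in range(3)]
    ti = [-sum(R[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [R[0] + [ti[0]], R[1] + [ti[1]], R[2] + [ti[2]],
            [0.0, 0.0, 0.0, 1.0]]

# Hypothetical poses of the headset and robot base in a shared world frame.
T_world_headset = yaw_T(math.pi / 2, 1.0, 0.0, 1.5)
T_world_robot   = yaw_T(0.0,         3.0, 2.0, 0.0)

# A gesture target observed 2 m along the headset's z-axis (headset frame).
p_headset = (0.0, 0.0, 2.0)

# Headset frame -> world frame -> robot frame.
p_world = apply(T_world_headset, p_headset)
p_robot = apply(invert(T_world_robot), p_world)
```

With these example poses, `p_robot` comes out to roughly `(-2.0, -2.0, 3.5)`, i.e. the gestured-at point expressed where the robot can plan toward it. In a real system this chain would typically be managed by a transform library rather than hand-rolled matrices.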


