Adapting the Human: Leveraging Wearable Technology in HRI

12/10/2020
by   David Puljiz, et al.

Under current HRI paradigms, all of the sensing, visualisation, and legibility of actions and motions are borne by the robot or its working cell. This necessarily makes robots more complex or confines them to specialised, structured environments. We propose leveraging state-of-the-art wearable technologies, such as augmented-reality head-mounted displays, smart watches, sensor tags, and radio-frequency ranging, to "adapt" the human and thereby reduce the requirements placed on, and the complexity of, the robots themselves.
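As a loose illustration (not taken from the paper) of what off-loading sensing onto the human could look like in practice, the sketch below maps a distance estimate from a wearable radio-frequency ranging tag to a speed-scaling factor for the robot, in the spirit of speed-and-separation monitoring. The SafetyZones thresholds, the speed_scale function, and the sample distances are hypothetical, chosen only for the example.

```python
# Minimal illustrative sketch (not from the paper): a robot cell reads a range
# estimate from a wearable radio-frequency tag worn by the human and scales its
# speed accordingly. All names and thresholds here are hypothetical.

from dataclasses import dataclass


@dataclass
class SafetyZones:
    stop_m: float = 0.5      # closer than this: halt the robot
    reduced_m: float = 1.5   # closer than this: reduced-speed mode


def speed_scale(tag_distance_m: float, zones: SafetyZones = SafetyZones()) -> float:
    """Map the human-to-robot distance reported by the wearable RF tag
    to a velocity scaling factor in [0, 1]."""
    if tag_distance_m <= zones.stop_m:
        return 0.0
    if tag_distance_m >= zones.reduced_m:
        return 1.0
    # Linear ramp between the stop boundary and the reduced-speed boundary.
    return (tag_distance_m - zones.stop_m) / (zones.reduced_m - zones.stop_m)


if __name__ == "__main__":
    for d in (0.3, 0.8, 1.2, 2.0):
        print(f"distance {d:.1f} m -> speed scale {speed_scale(d):.2f}")
```

The point of the sketch is that the range sensor travels with the human, so the robot itself needs no exteroceptive sensing beyond a radio link to honour the safety zones.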


Related research

02/20/2019  Development of Head-Mounted Projection Displays for Distributed, Collaborative, Augmented Reality Applications
    Distributed systems technologies supporting 3D visualization and social ...

03/06/2021  Visualizing Robot Intent for Object Handovers with Augmented Reality
    Humans are very skillful in communicating their intent for when and wher...

08/21/2023  Communicating Robot's Intentions while Assisting Users via Augmented Reality
    This paper explores the challenges faced by assistive robots in effectiv...

01/15/2019  Sensorless Hand Guidance using Microsoft Hololens
    Hand guidance of robots has proven to be a useful tool both for programm...

08/13/2019  General Hand Guidance Framework using Microsoft HoloLens
    Hand guidance emerged from the safety requirements for collaborative rob...

03/28/2023  Inside-out Infrared Marker Tracking via Head Mounted Displays for Smart Robot Programming
    Intuitive robot programming through use of tracked smart input devices r...
