Towards Intention Prediction for Handheld Robots: a Case of Simulated Block Copying

10/15/2018
by Janis Stolzenwald, et al.

Within this work, we explore intention inference for user actions in the context of a handheld robot setup. Handheld robots share the shape and properties of handheld tools while being able to process task information and aid manipulation. Here, we propose an intention prediction model to enhance cooperative task solving. Within a block copy task, we collect eye-gaze data using a robot-mounted remote eye tracker and use it to build a profile of visual attention over the task-relevant objects in the workspace scene. These profiles are used to predict user actions, i.e., which block will be picked up next and where it will be placed. Our results show that the proposed model can predict user actions well in advance, with an accuracy of 87.94%.
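
The abstract does not specify the model's internals, but as a rough illustration of the gaze-profile idea, here is a minimal Python sketch: it accumulates exponentially decayed gaze dwell time per task-relevant object and predicts the next pick target as the object with the highest recent visual attention. The class name, the decay weighting, and the block labels are assumptions for illustration, not the authors' implementation.

```python
from collections import defaultdict

class AttentionProfile:
    """Accumulates exponentially decayed gaze dwell time per object.

    A minimal sketch of a gaze-based attention profile; the decay
    constant and update scheme are assumptions, not the paper's model.
    """

    def __init__(self, decay: float = 0.9):
        self.decay = decay            # per-sample decay, favours recent gaze
        self.scores = defaultdict(float)

    def update(self, fixated_object: str, dwell_ms: float) -> None:
        # Decay all existing scores, then credit the currently fixated object.
        for obj in self.scores:
            self.scores[obj] *= self.decay
        self.scores[fixated_object] += dwell_ms

    def predict_next_pick(self):
        # Predict the object with the highest recent visual attention.
        if not self.scores:
            return None
        return max(self.scores, key=self.scores.get)

profile = AttentionProfile()
# Hypothetical stream of (fixated object, dwell time in ms) from the eye tracker.
for obj, dwell in [("block_red", 120.0), ("block_blue", 300.0), ("block_blue", 250.0)]:
    profile.update(obj, dwell)

print(profile.predict_next_pick())  # -> "block_blue"
```

The decayed sum means a block that was stared at long ago loses out to one fixated just before the reach, which is one plausible way to obtain predictions ahead of action completion.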


Related research

03/19/2019 - Rebellion and Obedience: The Effects of Intention Prediction in Cooperative Handheld Robots
  Within this work, we explore intention inference for user actions in the...

02/09/2023 - Gaze-based intention estimation: principles, methodologies, and applications in HRI
  Intention prediction has become a relevant field of research in Human-Ma...

10/28/2022 - HaptiX: Extending Cobot's Motion Intention Visualization by Haptic Feedback
  Nowadays, robots are found in a growing number of areas where they colla...

09/13/2022 - What You See is What You Grasp: User-Friendly Grasping Guided by Near-eye-tracking
  This work presents a next-generation human-robot interface that can infe...

08/18/2022 - Intention estimation from gaze and motion features for human-robot shared-control object manipulation
  Shared control can help in teleoperated object manipulation by assisting...

09/05/2017 - Using Cross-Model EgoSupervision to Learn Cooperative Basketball Intention
  We present a first-person method for cooperative basketball intention pr...

03/04/2021 - Gaze-contingent decoding of human navigation intention on an autonomous wheelchair platform
  We have pioneered the Where-You-Look-Is Where-You-Go approach to control...
