Vision- and tactile-based continuous multimodal intention and attention recognition for safer physical human-robot interaction

06/22/2022
by   Christopher Yee Wong, et al.

Employing skin-like tactile sensors on robots enhances both the safety and usability of collaborative robots by adding the capability to detect human contact. Unfortunately, simple binary tactile sensors alone cannot determine the context of the human contact – whether it is a deliberate interaction or an unintended collision that requires safety manoeuvres. Many published methods classify discrete interactions using more advanced tactile sensors or by analysing joint torques. Instead, we propose to augment the intention recognition capabilities of simple binary tactile sensors by adding a robot-mounted camera for human posture analysis. Different interaction characteristics, including touch location, human pose, and gaze direction, are used to train a supervised machine learning algorithm to classify whether a touch is intentional or not with 92% accuracy. We show with the collaborative robot Baxter that this multimodal intention recognition is significantly more accurate than monomodal analysis. Furthermore, our method can also continuously monitor interactions that fluidly change between intentional and unintentional by gauging the user's attention through gaze. If a user stops paying attention mid-task, the proposed intention and attention recognition algorithm can activate safety features to prevent unsafe interactions. In addition, the proposed method is robot and touch sensor layout agnostic and is complementary with other methods.
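The abstract describes fusing binary touch events with vision-derived posture and gaze features to classify touches and to monitor attention continuously. A minimal sketch of that idea is below; the feature names, weights, and thresholds are hypothetical stand-ins for illustration (the paper trains a supervised classifier on real interaction data), and the attention monitor simply checks the fraction of recent gaze samples directed at the robot.

```python
import math

# Hypothetical hand-set weights standing in for a trained supervised model.
# Each feature is a binary cue extracted from the camera or tactile skin.
WEIGHTS = {
    "gaze_toward_touch": 3.0,  # user looking at the contact point?
    "facing_robot": 1.5,       # torso/pose oriented toward the robot?
    "touch_on_forearm": 0.5,   # contact on a typical interaction surface?
}
BIAS = -2.0


def intention_score(features):
    """Logistic score in [0, 1]: estimated probability the touch is deliberate."""
    z = BIAS + sum(WEIGHTS[k] * float(v) for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))


def classify_touch(features, threshold=0.5):
    """Label a touch event as intentional or unintentional."""
    return "intentional" if intention_score(features) >= threshold else "unintentional"


def user_is_attentive(gaze_on_robot, window=5, min_fraction=0.6):
    """Continuous attention check over a sliding window of gaze samples
    (1 = gaze on robot, 0 = gaze elsewhere). A False result would trigger
    the robot's safety behaviour."""
    recent = gaze_on_robot[-window:]
    return sum(recent) / len(recent) >= min_fraction


# Example: an attentive user reaching for the robot vs. an accidental bump.
deliberate = classify_touch(
    {"gaze_toward_touch": 1, "facing_robot": 1, "touch_on_forearm": 1}
)
accidental = classify_touch(
    {"gaze_toward_touch": 0, "facing_robot": 0, "touch_on_forearm": 1}
)
```

Because the score is continuous, the same machinery supports interactions that fluidly change category: re-evaluating `user_is_attentive` on each new gaze sample lets the robot downgrade an ongoing contact from intentional to unsafe the moment the user looks away.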


research
08/09/2021

Organization and Understanding of a Tactile Information Dataset TacAct During Physical Human-Robot Interactions

Advanced service robots require superior tactile intelligence to guarant...
research
08/06/2023

Customizing Textile and Tactile Skins for Interactive Industrial Robots

Tactile skins made from textiles enhance robot-human interaction by loca...
research
02/22/2021

HAIR: Head-mounted AR Intention Recognition

Human teams exhibit both implicit and explicit intention sharing. To fur...
research
09/17/2019

What Are You Looking at? Detecting Human Intention in Gaze based Human-Robot Interaction

In gaze based Human-Robot Interaction (HRI), it is important to determin...
research
02/27/2022

MOCA-S: A Sensitive Mobile Collaborative Robotic Assistant exploiting Low-Cost Capacitive Tactile Cover and Whole-Body Control

Safety is one of the most fundamental aspects of robotics, especially wh...
research
11/28/2017

The robot skin placement problem: a new technique to place triangular modules inside polygons

Providing robots with large-scale robot skin is a challenging goal, espe...
research
03/04/2020

Optimal Deep Learning for Robot Touch

This article illustrates the application of deep learning to robot touch...
