Human Tactile Gesture Interpretation for Robotic Systems

12/03/2020 ∙ by Elizabeth Bibit Bianchini, et al.

Human-robot interactions are less efficient and communicative than human-to-human interactions, and a key reason is the lack of an informed sense of touch in robotic systems. Existing literature demonstrates robot success in executing handovers with humans, albeit with substantial reliance on external sensing or on primitive signal processing methods, deficient compared to the rich set of information humans can detect. In contrast, we present models capable of distinguishing between four classes of human tactile gestures at a robot's end effector, using only a non-collocated six-axis force sensor at the wrist. Because no such resource exists in the literature, this work describes 1) the collection of an extensive force dataset characterized by human-robot contact events, and 2) classification models informed by this dataset that determine the nature of the interaction. We demonstrate high classification accuracies among our proposed gesture definitions on a test set, emphasizing that neural network classifiers on the raw data outperform several other combinations of algorithms and feature sets.
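The abstract's core pipeline — feeding raw six-axis force/torque samples from the wrist sensor directly into a neural network that outputs one of four gesture classes — can be sketched as follows. This is a hypothetical illustration, not the authors' code: the window length, hidden size, and single-hidden-layer architecture are assumptions chosen for brevity.

```python
import numpy as np

# Assumed dimensions (not from the paper): a fixed window of raw samples.
WINDOW = 50   # samples per classification window (assumption)
AXES = 6      # Fx, Fy, Fz, Tx, Ty, Tz from the six-axis wrist sensor
CLASSES = 4   # the four tactile gesture classes described in the abstract
HIDDEN = 32   # hidden-layer width (assumption)

# Randomly initialized weights stand in for a trained network.
rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.1, (WINDOW * AXES, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, CLASSES))
b2 = np.zeros(CLASSES)

def classify(window: np.ndarray) -> np.ndarray:
    """Map a (WINDOW, AXES) array of raw force/torque readings to
    class probabilities over the four gesture classes."""
    x = window.reshape(-1)              # raw samples, no hand-built features
    h = np.maximum(0.0, x @ W1 + b1)    # ReLU hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())   # numerically stable softmax
    return e / e.sum()

probs = classify(rng.normal(size=(WINDOW, AXES)))
```

A trained version of such a model would replace the random weights with parameters fit on the labeled contact-event dataset; the point of the sketch is only that the classifier consumes the raw sensor stream end to end.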
