Grasp Stability Assessment Through Attention-Guided Cross-Modality Fusion and Transfer Learning

08/02/2023
by   Zhuangzhuang Zhang, et al.

Extensive research has been conducted on assessing grasp stability, a crucial prerequisite for achieving optimal grasping strategies such as the minimum-force grasping policy. However, existing works employ basic feature-level fusion to combine the visual and tactile modalities, leaving complementary information underused and failing to model interactions between unimodal features. This work proposes an attention-guided cross-modality fusion architecture that comprehensively integrates visual and tactile features; the model mainly comprises convolutional neural networks (CNNs), self-attention, and cross-attention mechanisms. In addition, most existing methods collect datasets from real-world systems, which is time-consuming and costly, and the resulting datasets are comparatively small. This work establishes a robotic grasping system in physics simulation to collect a multimodal dataset. To address the sim-to-real transfer gap, we propose a transfer strategy encompassing domain randomization and domain adaptation techniques. The experimental results demonstrate that the proposed fusion framework achieves markedly improved prediction performance (approximately 10% improvement) and that the trained model can be reliably transferred to real robotic systems, indicating its potential to address real-world challenges.
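The abstract describes a fusion model built from CNN feature extractors, intra-modality self-attention, and cross-attention between the visual and tactile streams. The paper's exact architecture is not given here, so the following is only a minimal PyTorch sketch of that general pattern: all module names, dimensions, and the two-way cross-attention layout are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class CrossModalFusion(nn.Module):
    """Hypothetical sketch of attention-guided visual-tactile fusion.

    Assumes each modality has already been encoded (e.g. by a CNN) into a
    sequence of feature tokens of the same embedding dimension.
    """

    def __init__(self, dim=128, heads=4):
        super().__init__()
        # Intra-modality self-attention for each stream
        self.self_vis = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.self_tac = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Cross-attention: each modality queries the other
        self.cross_v2t = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.cross_t2v = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Binary head: stable vs. unstable grasp
        self.head = nn.Linear(2 * dim, 2)

    def forward(self, vis_tokens, tac_tokens):
        # Self-attention refines each modality independently
        v, _ = self.self_vis(vis_tokens, vis_tokens, vis_tokens)
        t, _ = self.self_tac(tac_tokens, tac_tokens, tac_tokens)
        # Visual queries attend to tactile keys/values, and vice versa
        v2t, _ = self.cross_v2t(v, t, t)
        t2v, _ = self.cross_t2v(t, v, v)
        # Pool token sequences and classify the fused representation
        fused = torch.cat([v2t.mean(dim=1), t2v.mean(dim=1)], dim=-1)
        return self.head(fused)

# Example: a batch of 8 samples, 49 visual tokens and 16 tactile tokens
vis = torch.randn(8, 49, 128)
tac = torch.randn(8, 16, 128)
logits = CrossModalFusion()(vis, tac)   # shape (8, 2)
```

Running the cross-attention in both directions (visual-to-tactile and tactile-to-visual) is one common way to let each unimodal feature set condition on the other before pooling; the paper may use a different arrangement.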


Related research

06/23/2020  Grasp State Assessment of Deformable Objects Using Visual-Tactile Fusion Perception
  Humans can quickly determine the force required to grasp a deformable ob...

09/18/2022  VisTaNet: Attention Guided Deep Fusion for Surface Roughness Classification
  Human texture perception is a weighted average of multi-sensory inputs: ...

05/30/2019  Bayesian Grasp: Robotic visual stable grasp based on prior tactile knowledge
  Robotic grasp detection is a fundamental capability for intelligent mani...

10/16/2017  The Feeling of Success: Does Touch Sensing Help Predict Grasp Outcomes?
  A successful grasp requires careful balancing of the contact forces. Ded...

02/27/2023  Visuo-Tactile-Based Slip Detection Using A Multi-Scale Temporal Convolution Network
  Humans can accurately determine whether the object in hand has slipped o...

02/24/2022  When Transformer Meets Robotic Grasping: Exploits Context for Efficient Grasp Detection
  In this paper, we present a transformer-based architecture, namely TF-Gr...

05/05/2023  Clothes Grasping and Unfolding Based on RGB-D Semantic Segmentation
  Clothes grasping and unfolding is a core step in robotic-assisted dressi...
