Single-grasp deformable object discrimination: the effect of gripper morphology, sensing modalities, and action parameters

04/13/2022
by Michal Pliska et al.

We studied the discrimination of deformable objects by grasping them, using four different robot hands / grippers: the Barrett Hand (3 fingers with adjustable configuration; 96 tactile, 8 position, and 3 torque sensors), the qb SoftHand (5 fingers, 1 motor, position and current feedback), and two industrial-type parallel-jaw grippers with position and effort feedback (Robotiq 2F-85 and OnRobot RG6). A set of 9 ordinary objects differing in size and stiffness, and another, highly challenging set of 20 polyurethane foams differing in material properties only, were used. We systematically compare the grippers' performance, together with the effects of: (1) the type of classifier (k-NN, SVM, LSTM) operating on raw time series or on features, (2) the action parameters (grasping configuration and speed of squeezing), and (3) the contribution of the sensory modalities. The classification results are complemented by a visualization of the data using PCA. We found that: (i) all the grippers except the qb SoftHand could reliably distinguish the ordinary objects set; (ii) the Barrett Hand reached around 95% accuracy on the foams, the OnRobot RG6 around 75%; (iii) across grippers, SVM over features and LSTM on raw time series performed best; (iv) faster compression speeds degrade classification performance; (v) transfer learning between compression speeds worked well for the Barrett Hand only, and transfer between grasping configurations is limited; (vi) ablation experiments provided intriguing insights: sometimes a single sensory channel suffices for discrimination. Overall, the Barrett Hand, a complex and expensive device with rich sensory feedback, provided the best results, but uncalibrated parallel-jaw grippers without tactile sensors can achieve sufficient performance for single-grasp object discrimination based on position and effort data only. Transfer learning between the different robot hands remains a challenge.
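The feature-based pipeline described in the abstract can be illustrated with a minimal sketch: per-grasp features are extracted from the position/effort time series of a squeezing action and fed to an SVM. This is not the authors' code; the synthetic stiffness model, feature choices, and classifier settings are all illustrative assumptions.

```python
# Illustrative sketch (not the paper's implementation): discriminating
# objects of different stiffness from a single squeeze, using simple
# features over position/effort traces and an SVM classifier.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def synth_grasp(stiffness, n=100):
    """Synthetic position/effort traces for squeezing an object;
    a real setup would record these from the gripper's feedback."""
    pos = np.linspace(0.0, 1.0, n)                         # jaw closing
    effort = stiffness * pos + 0.05 * rng.standard_normal(n)
    return pos, effort

def features(pos, effort):
    """Simple per-grasp features computed from the raw time series."""
    slope = np.polyfit(pos, effort, 1)[0]                  # apparent stiffness
    return [effort.max(), effort.mean(), slope]

# Three hypothetical "objects" differing in stiffness only, 20 grasps each
X, y = [], []
for label, k in enumerate([0.5, 1.0, 2.0]):
    for _ in range(20):
        X.append(features(*synth_grasp(k)))
        y.append(label)

clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

The same data could instead be passed as raw sequences to an LSTM, which the paper reports as the other strong option; the feature route shown here is the cheaper one and needs no GPU.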


research
11/02/2020

Grasping in the Dark: Compliant Grasping using Shadow Dexterous Hand and BioTac Tactile Sensor

When it comes to grasping and manipulating objects, the human hand is th...
research
05/10/2018

Learning to Grasp Without Seeing

Can a robot grasp an unknown object without seeing it? In this paper, we...
research
05/28/2018

More Than a Feeling: Learning to Grasp and Regrasp using Vision and Touch

For humans, the process of grasping an object relies heavily on rich tac...
research
07/17/2019

Tactile Model O: Fabrication and testing of a 3d-printed, three-fingered tactile robot hand

Bringing tactile sensation to robotic hands will allow for more effectiv...
research
03/23/2023

TactoFind: A Tactile Only System for Object Retrieval

We study the problem of object retrieval in scenarios where visual sensi...
research
06/02/2021

Grasp stability prediction with time series data based on STFT and LSTM

With an increasing demand for robots, robotic grasping will have a more i...
