Visual-Tactile Multimodality for Following Deformable Linear Objects Using Reinforcement Learning

03/31/2022
by Leszek Pecyna, et al.

Manipulation of deformable objects is a challenging task for a robot. It is problematic to use a single sensory input to track the behaviour of such objects: vision can be subject to occlusions, whereas tactile inputs cannot capture the global information that is useful for the task. In this paper, we study for the first time the problem of using vision and tactile inputs together to complete the task of following deformable linear objects. We create a Reinforcement Learning agent using different sensing modalities and investigate how its behaviour can be boosted using visual-tactile fusion, compared to using a single sensing modality. To this end, we developed a benchmark in simulation for manipulating deformable linear objects using multimodal sensing inputs. The policy of the agent uses distilled information, e.g., the pose of the object in both the visual and tactile perspectives, instead of the raw sensing signals, so that it can be directly transferred to real environments. In this way, we disentangle the perception system from the learned control policy. Our extensive experiments show that the use of both vision and tactile inputs, together with proprioception, allows the agent to complete the task in up to 92% of cases. These results can provide valuable insights for the future design of tactile sensors and for deformable object manipulation.
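
The abstract describes a policy that acts on distilled features (the pose of the object as seen from the visual and tactile perspectives, plus proprioception) rather than on raw sensor signals. Below is a minimal, hypothetical sketch of what such a fused observation could look like; the function name, feature dimensions, and validity flags are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): the policy consumes "distilled"
# features -- object pose estimates from the vision and tactile perception
# modules plus gripper proprioception -- instead of raw images or raw tactile
# arrays, so the learned policy is decoupled from the perception system.

def build_observation(visual_pose, visual_valid, tactile_pose, tactile_valid, joint_state):
    """Concatenate distilled multimodal features into a single policy input.

    visual_pose   : (3,) estimated position/orientation of the cable from the camera
    visual_valid  : bool, False when the cable is occluded in the camera view
    tactile_pose  : (2,) estimated position/angle of the cable on the tactile sensor
    tactile_valid : bool, False when the sensor has lost contact with the cable
    joint_state   : (n,) gripper/arm proprioception
    """
    return np.concatenate([
        np.asarray(visual_pose, dtype=np.float32),
        [float(visual_valid)],          # validity flag lets the policy discount stale vision
        np.asarray(tactile_pose, dtype=np.float32),
        [float(tactile_valid)],         # likewise for lost tactile contact
        np.asarray(joint_state, dtype=np.float32),
    ])


if __name__ == "__main__":
    # Example step: vision occluded, tactile still tracking the cable.
    obs = build_observation(
        visual_pose=np.zeros(3), visual_valid=False,
        tactile_pose=np.array([0.01, 0.15]), tactile_valid=True,
        joint_state=np.zeros(7),
    )
    print(obs.shape)  # a policy network (e.g., an MLP trained with RL) would map this to an action
```

Because the observation is a low-dimensional pose vector rather than raw pixels or taxel arrays, the same policy interface can, in principle, be reused on real hardware by swapping in real perception modules.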

Related research

04/11/2023
Dexterous In-Hand Manipulation of Slender Cylindrical Objects through Deep Reinforcement Learning with Tactile Sensing
Continuous in-hand manipulation is an important physical interaction ski...

04/02/2021
Tactile-RL for Insertion: Generalization to Objects of Unknown Geometry
Object insertion is a classic contact-rich manipulation task. The task r...

09/26/2022
Learning Self-Supervised Representations from Vision and Touch for Active Sliding Perception of Deformable Surfaces
Humans make extensive use of vision and touch as complementary senses, w...

09/18/2023
General In-Hand Object Rotation with Vision and Touch
We introduce RotateIt, a system that enables fingertip-based object rota...

07/14/2022
Visuo-Tactile Manipulation Planning Using Reinforcement Learning with Affordance Representation
Robots are increasingly expected to manipulate objects in ever more unst...

03/08/2019
Learning to Identify Object Instances by Touch: Tactile Recognition via Multimodal Matching
Much of the literature on robotic perception focuses on the visual modal...

11/21/2019
Visual Tactile Fusion Object Clustering
Object clustering, aiming at grouping similar objects into one cluster w...
