Dexterity from Touch: Self-Supervised Pre-Training of Tactile Representations with Robotic Play

03/21/2023
by Irmak Güzey, et al.

Teaching dexterity to multi-fingered robots has been a longstanding challenge in robotics. Most prominent work in this area focuses on learning controllers or policies that either operate on visual observations or on state estimates derived from vision. However, such methods perform poorly on fine-grained manipulation tasks that require reasoning about contact forces or about objects occluded by the hand itself. In this work, we present T-Dex, a new approach to tactile-based dexterity that operates in two phases. In the first phase, we collect 2.5 hours of play data, which is used to train self-supervised tactile encoders; this step is necessary to map high-dimensional tactile readings to a lower-dimensional embedding. In the second phase, given a handful of demonstrations for a dexterous task, we learn non-parametric policies that combine the tactile observations with visual ones. Across five challenging dexterous tasks, we show that our tactile-based dexterity models outperform purely vision-based and torque-based models by an average of 1.7x. Finally, we provide a detailed analysis of factors critical to T-Dex, including the importance of play data, architectures, and representation learning.
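The two phases lend themselves to short sketches. First, phase one: the snippet below pre-trains a tactile encoder with a BYOL-style self-supervised objective on unlabeled play data, one common choice for this kind of pre-training. The encoder architecture, tensor shapes, and hyperparameters are illustrative assumptions rather than the paper's exact configuration.

```python
# Sketch of phase one: BYOL-style self-supervised pre-training of a tactile
# encoder on unlabeled play data. Shapes, layers, and hyperparameters are
# illustrative assumptions, not the paper's exact configuration.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class TactileEncoder(nn.Module):
    """Maps an image-like tactile reading (e.g. 3 force axes stacked as
    channels over a grid of taxels) to a low-dimensional embedding."""
    def __init__(self, in_channels=3, embed_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x):
        return self.net(x)

def mlp(in_dim, out_dim, hidden=256):
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, out_dim))

class BYOL(nn.Module):
    def __init__(self, encoder, embed_dim=128, proj_dim=64, tau=0.996):
        super().__init__()
        self.online_encoder = encoder
        self.online_projector = mlp(embed_dim, proj_dim)
        self.predictor = mlp(proj_dim, proj_dim)
        # The target network is a momentum (EMA) copy of the online network;
        # it receives no gradients.
        self.target_encoder = copy.deepcopy(encoder)
        self.target_projector = copy.deepcopy(self.online_projector)
        for p in list(self.target_encoder.parameters()) + \
                 list(self.target_projector.parameters()):
            p.requires_grad = False
        self.tau = tau

    @torch.no_grad()
    def update_target(self):
        # Exponential moving average update, called after each optimizer step.
        pairs = list(zip(self.online_encoder.parameters(),
                         self.target_encoder.parameters())) + \
                list(zip(self.online_projector.parameters(),
                         self.target_projector.parameters()))
        for online_p, target_p in pairs:
            target_p.mul_(self.tau).add_((1.0 - self.tau) * online_p)

    def loss(self, view1, view2):
        """Symmetric BYOL loss on two augmented views of the same reading."""
        def one_side(a, b):
            p = F.normalize(self.predictor(
                self.online_projector(self.online_encoder(a))), dim=-1)
            with torch.no_grad():
                z = F.normalize(self.target_projector(
                    self.target_encoder(b)), dim=-1)
            return 2.0 - 2.0 * (p * z).sum(dim=-1).mean()
        return one_side(view1, view2) + one_side(view2, view1)
```

In a training loop, each play-data reading would be augmented into two views (for example, small random noise or shifts), the symmetric loss minimized with a standard optimizer, and update_target() called after every step; only the online encoder is kept afterwards as the tactile feature extractor.

Phase two, the non-parametric policy, can be read as a nearest-neighbor lookup over demonstration frames. The class below is a hypothetical illustration of that idea: the feature concatenation, the tactile weighting, and k are all assumptions, and a real system would wrap this with the visual encoder and the robot controller.

```python
# Sketch of phase two: a non-parametric nearest-neighbor policy over
# demonstration frames, each embedded as [visual ; tactile] features.
import numpy as np

class NearestNeighborPolicy:
    def __init__(self, demo_embeddings, demo_actions, k=1):
        # demo_embeddings: (N, D) array of concatenated visual/tactile features
        # demo_actions:    (N, A) array of the actions taken at those frames
        self.embeddings = demo_embeddings
        self.actions = demo_actions
        self.k = k

    @staticmethod
    def embed(visual_feat, tactile_feat, tactile_weight=1.0):
        # Scaling the tactile block trades off the two modalities.
        return np.concatenate([visual_feat, tactile_weight * tactile_feat])

    def act(self, visual_feat, tactile_feat):
        # Copy (or average) the action of the closest demonstration frame(s).
        query = self.embed(visual_feat, tactile_feat)
        dists = np.linalg.norm(self.embeddings - query, axis=1)
        nearest = np.argsort(dists)[: self.k]
        return self.actions[nearest].mean(axis=0)
```

Because the policy is non-parametric, adding a new demonstration only means appending its embedded frames and actions; nothing is retrained beyond the tactile encoder from phase one.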

Related research

09/26/2022 · Learning Self-Supervised Representations from Vision and Touch for Active Sliding Perception of Deformable Surfaces
Humans make extensive use of vision and touch as complementary senses, w...

09/21/2023 · See to Touch: Learning Tactile Dexterity through Visual Incentives
Equipping multi-fingered robots with tactile sensing is crucial for achi...

03/08/2019 · Learning to Identify Object Instances by Touch: Tactile Recognition via Multimodal Matching
Much of the literature on robotic perception focuses on the visual modal...

11/22/2022 · Touch and Go: Learning from Human-Collected Vision and Touch
The ability to associate touch with sight is essential for tasks that re...

10/03/2022 · That Sounds Right: Auditory Self-Supervision for Dynamic Robot Manipulation
Learning to produce contact-rich, dynamic behaviors from raw sensory dat...

03/11/2022 · Masked Visual Pre-training for Motor Control
This paper shows that self-supervised visual pre-training from real-worl...

10/04/2022 · Safely Learning Visuo-Tactile Feedback Policies in Real For Industrial Insertion
Industrial insertion tasks are often performed repetitively with parts t...
