Touch and Go: Learning from Human-Collected Vision and Touch

11/22/2022
by Fengyu Yang, et al.

The ability to associate touch with sight is essential for tasks that require physically interacting with objects in the world. We propose a dataset with paired visual and tactile data called Touch and Go, in which human data collectors probe objects in natural environments using tactile sensors, while simultaneously recording egocentric video. In contrast to previous efforts, which have largely been confined to lab settings or simulated environments, our dataset spans a large number of "in the wild" objects and scenes. To demonstrate our dataset's effectiveness, we successfully apply it to a variety of tasks: 1) self-supervised visuo-tactile feature learning, 2) tactile-driven image stylization, i.e., making the visual appearance of an object more consistent with a given tactile signal, and 3) predicting future frames of a tactile signal from visuo-tactile inputs.
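The abstract does not spell out the self-supervised objective, but a common way to learn joint visuo-tactile features from paired data like this is contrastive alignment: embeddings of a video frame and the tactile reading captured at the same moment are pulled together, while mismatched pairs in the batch are pushed apart. A minimal NumPy sketch of a symmetric InfoNCE loss over such pairs, assuming the encoders have already produced fixed-size feature vectors (the function name, shapes, and temperature are illustrative, not the paper's actual implementation):

```python
import numpy as np

def info_nce_loss(visual_emb, tactile_emb, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of paired embeddings.

    visual_emb, tactile_emb: (batch, dim) arrays where row i of each
    array comes from the same moment of the same touch interaction.
    """
    # L2-normalize so the dot product is a cosine similarity.
    v = visual_emb / np.linalg.norm(visual_emb, axis=1, keepdims=True)
    t = tactile_emb / np.linalg.norm(tactile_emb, axis=1, keepdims=True)
    logits = v @ t.T / temperature        # (batch, batch) similarity matrix
    labels = np.arange(len(v))            # true pairs lie on the diagonal

    def xent(lg):
        # Row-wise cross-entropy against the diagonal targets,
        # with the usual max-subtraction for numerical stability.
        lg = lg - lg.max(axis=1, keepdims=True)
        log_probs = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -log_probs[labels, labels].mean()

    # Average the vision->touch and touch->vision directions.
    return 0.5 * (xent(logits) + xent(logits.T))
```

When the two embeddings of each pair agree, the diagonal dominates each row of `logits` and the loss approaches zero; shuffling the tactile batch breaks the correspondence and the loss rises, which is the signal that drives the two encoders toward a shared representation.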



Related research

Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors (03/24/2021)
The ability to perceive object slip through tactile feedback allows huma...

Learning to Detect Slip with Barometric Tactile Sensors and a Temporal Convolutional Neural Network (02/19/2022)
The ability to perceive object slip via tactile feedback enables humans ...

SenseNet: 3D Objects Database and Tactile Simulator (12/31/2017)
The majority of artificial intelligence research, as it relates from whi...

Connecting Touch and Vision via Cross-Modal Prediction (06/14/2019)
Humans perceive the world using multi-modal sensory inputs such as visio...

Dexterity from Touch: Self-Supervised Pre-Training of Tactile Representations with Robotic Play (03/21/2023)
Teaching dexterity to multi-fingered robots has been a longstanding chal...

Tactile-ViewGCN: Learning Shape Descriptor from Tactile Data using Graph Convolutional Network (03/12/2022)
For humans, our "senses of touch" have always been necessary for our abi...

ViTa-SLAM: A Bio-inspired Visuo-Tactile SLAM for Navigation while Interacting with Aliased Environments (06/14/2019)
RatSLAM is a rat hippocampus-inspired visual Simultaneous Localization a...
