Controllable Visual-Tactile Synthesis

05/04/2023
by Ruihan Gao et al.

Deep generative models have many content creation applications, such as graphic design, e-commerce, and virtual try-on. However, current works mainly focus on synthesizing realistic visual outputs, often ignoring other sensory modalities such as touch, which limits physical interaction with users. In this work, we leverage deep generative models to create a multi-sensory experience in which users can both touch and see a synthesized object as they slide their fingers over a haptic surface. The main challenges are the significant scale discrepancy between vision and touch sensing and the lack of an explicit mapping from touch-sensing data to a haptic rendering device. To bridge this gap, we collect high-resolution tactile data with a GelSight sensor and create a new visuotactile clothing dataset. We then develop a conditional generative model that synthesizes both visual and tactile outputs from a single sketch. We evaluate our method in terms of image quality and tactile rendering accuracy. Finally, we introduce a pipeline that renders high-quality visual and tactile outputs on an electroadhesion-based haptic device for an immersive experience, allowing for challenging materials and editable sketch inputs.
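To make the pipeline concrete, the sketch below illustrates the data flow the abstract describes: one sketch conditions two outputs, a visual image and a tactile map rendered at a finer spatial resolution than the image, reflecting the scale discrepancy between camera pixels and touch sensing. This is a minimal schematic stand-in, not the authors' model; the function name, shapes, and upscale factor are all illustrative assumptions.

```python
import numpy as np

def synthesize_visual_tactile(sketch: np.ndarray, upscale: int = 4):
    """Toy stand-in for a sketch-conditioned generator (hypothetical API).

    sketch:  (H, W) grayscale edge map in [0, 1].
    returns: a "visual" (H, W, 3) RGB image and a "tactile" map rendered
             at upscale x the visual resolution, mimicking the finer
             spatial scale of touch sensing (e.g., GelSight data).
    """
    # "Visual" branch: broadcast the sketch into three color channels.
    visual = np.repeat(sketch[:, :, None], 3, axis=2)
    # "Tactile" branch: upsample to a finer grid; a real model would
    # predict texture/height detail here rather than replicate pixels.
    tactile = np.kron(sketch, np.ones((upscale, upscale)))
    return visual, tactile

sketch = np.zeros((8, 8))
sketch[2:6, 2:6] = 1.0
visual, tactile = synthesize_visual_tactile(sketch)
print(visual.shape, tactile.shape)  # (8, 8, 3) (32, 32)
```

The point of the sketch is only the interface: a single conditioning input yields two spatially aligned outputs at different resolutions, which is the gap the paper's generative model and haptic-rendering pipeline are built to bridge.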


Related research

06/14/2019 - Connecting Touch and Vision via Cross-Modal Prediction
Humans perceive the world using multi-modal sensory inputs such as visio...

01/17/2023 - Vis2Hap: Vision-based Haptic Rendering by Cross-modal Generation
To assist robots in teleoperation tasks, haptic rendering which allows h...

02/17/2019 - "Touching to See" and "Seeing to Feel": Robotic Cross-modal Sensory Data Generation for Visual-Tactile Perception
The integration of visual-tactile stimulus is common while humans perfor...

10/17/2021 - Deep Tactile Experience: Estimating Tactile Sensor Output from Depth Sensor Data
Tactile sensing is inherently contact based. To use tactile data, robots...

09/18/2023 - General In-Hand Object Rotation with Vision and Touch
We introduce RotateIt, a system that enables fingertip-based object rota...

07/28/2022 - Rewriting Geometric Rules of a GAN
Deep generative models make visual content creation more accessible to n...

07/05/2023 - Object Recognition System on a Tactile Device for Visually Impaired
People with visual impairments face numerous challenges when interacting...
