Learning Intuitive Physics with Multimodal Generative Models

01/12/2021
by Sahand Rezaei-Shoshtari et al.

Predicting the future interaction of objects when they come into contact with their environment is key for autonomous agents to take intelligent and anticipatory actions. This paper presents a perception framework that fuses visual and tactile feedback to make predictions about the expected motion of objects in dynamic scenes. Visual information captures object properties such as 3D shape and location, while tactile information provides critical cues about interaction forces and the resulting object motion once contact with the environment is made. Utilizing a novel See-Through-your-Skin (STS) sensor that provides high-resolution multimodal sensing of contact surfaces, our system captures both the visual appearance and the tactile properties of objects. We interpret the dual-stream signals from the sensor using a Multimodal Variational Autoencoder (MVAE), allowing us to capture both modalities of contacting objects and to learn a mapping from visual to tactile interaction and vice versa. Additionally, the perceptual system can be used to infer the outcome of future physical interactions, which we validate through simulated and real-world experiments in which the resting state of an object is predicted from given initial conditions.
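
The abstract describes each modality getting its own encoder and decoder, with the two posteriors fused in a shared latent space so that either modality can be reconstructed from the other. Below is a minimal PyTorch sketch of such a Multimodal Variational Autoencoder; the product-of-experts fusion rule, MLP architectures, 64x64 input resolution, and unweighted ELBO are assumptions for illustration and not the authors' actual implementation.

```python
# Minimal sketch of a Multimodal Variational Autoencoder (MVAE) that fuses
# visual and tactile observations in a shared latent space.
# Illustrative reconstruction only: the product-of-experts fusion, network
# sizes, and data dimensions are assumptions, not the paper's exact setup.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Maps one modality (flattened image) to Gaussian posterior parameters."""
    def __init__(self, in_dim, latent_dim=32, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.logvar(h)


class Decoder(nn.Module):
    """Reconstructs one modality from a latent sample."""
    def __init__(self, out_dim, latent_dim=32, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, out_dim))

    def forward(self, z):
        return self.net(z)


def product_of_experts(mus, logvars):
    """Combine Gaussian experts (including a standard-normal prior expert)."""
    precisions = [torch.exp(-lv) for lv in logvars]
    total_prec = sum(precisions)
    mu = sum(m * p for m, p in zip(mus, precisions)) / total_prec
    logvar = torch.log(1.0 / total_prec)
    return mu, logvar


class MVAE(nn.Module):
    def __init__(self, vis_dim, tac_dim, latent_dim=32):
        super().__init__()
        self.enc_vis = Encoder(vis_dim, latent_dim)
        self.enc_tac = Encoder(tac_dim, latent_dim)
        self.dec_vis = Decoder(vis_dim, latent_dim)
        self.dec_tac = Decoder(tac_dim, latent_dim)
        self.latent_dim = latent_dim

    def forward(self, vis=None, tac=None):
        # A prior expert N(0, I) keeps the posterior defined when a modality is missing,
        # which is what enables cross-modal (visual <-> tactile) prediction.
        batch = (vis if vis is not None else tac).shape[0]
        mus = [torch.zeros(batch, self.latent_dim)]
        logvars = [torch.zeros(batch, self.latent_dim)]
        if vis is not None:
            m, lv = self.enc_vis(vis); mus.append(m); logvars.append(lv)
        if tac is not None:
            m, lv = self.enc_tac(tac); mus.append(m); logvars.append(lv)
        mu, logvar = product_of_experts(mus, logvars)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization
        return self.dec_vis(z), self.dec_tac(z), mu, logvar


def elbo(vis, tac, vis_hat, tac_hat, mu, logvar):
    """Reconstruction terms for both modalities plus the KL to the prior."""
    recon = (F.mse_loss(vis_hat, vis, reduction="sum")
             + F.mse_loss(tac_hat, tac, reduction="sum"))
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kld


if __name__ == "__main__":
    model = MVAE(vis_dim=64 * 64, tac_dim=64 * 64)
    vis, tac = torch.rand(8, 64 * 64), torch.rand(8, 64 * 64)
    vis_hat, tac_hat, mu, logvar = model(vis, tac)      # joint encoding of both streams
    loss = elbo(vis, tac, vis_hat, tac_hat, mu, logvar)
    loss.backward()
    vis_from_tac, _, _, _ = model(vis=None, tac=tac)    # cross-modal: tactile -> visual
    print(loss.item(), vis_from_tac.shape)
```

The cross-modal call at the end mirrors the mapping described in the abstract: conditioning on tactile input alone still yields a visual prediction (and vice versa), which is what allows the model to infer the outcome of a future contact from an initial observation.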

Related research

11/18/2020 - Seeing Through your Skin: Recognizing Objects with a Novel Visuotactile Sensor
We introduce a new class of vision-based sensor and associated algorithm...

09/20/2021 - Efficient shape mapping through dense touch and vision
Knowledge of 3-D object shape is of great importance to robot manipulati...

10/07/2022 - Learning the Dynamics of Compliant Tool-Environment Interaction for Visuo-Tactile Contact Servoing
Many manipulation tasks require the robot to control the contact between...

04/18/2022 - Multimodal Proximity and Visuotactile Sensing With a Selectively Transmissive Soft Membrane
The most common sensing modalities found in a robot perception system ar...

09/10/2018 - Multimodal feedback for active robot-object interaction
In this work, we present a multimodal system for active robot-object int...

04/20/2021 - Tactile Perception based on Injected Vibration in Soft Sensor
Tactile perception using vibration sensation helps robots recognize thei...

04/29/2020 - Teaching Cameras to Feel: Estimating Tactile Physical Properties of Surfaces From Images
The connection between visual input and tactile sensing is critical for ...
