Visual-Tactile Sensing for Real-Time Liquid Volume Estimation in Grasping

02/23/2022
by   Fan Zhu, et al.

We propose a deep visuo-tactile model for real-time estimation of the volume of liquid inside a deformable container in a proprioceptive way. We fuse two sensory modalities, i.e., the raw visual inputs from the RGB camera and the tactile cues from our specific tactile sensor, without any extra sensor calibration. The robotic system is controlled and adjusted in real time based on the estimation model. The main contributions and novelties of our work are as follows: 1) We explore a proprioceptive approach to liquid volume estimation by developing an end-to-end predictive model with multi-modal convolutional networks, which achieves high precision with an error of around 2 ml in the experimental validation. 2) We propose a multi-task learning architecture that jointly considers the losses from both classification and regression tasks, and we comparatively evaluate the performance of each variant on the collected data and on the actual robotic platform. 3) We utilize the proprioceptive robotic system to accurately serve and control a requested volume of liquid that flows continuously into a deformable container in real time. 4) We adaptively adjust the grasping plan according to the real-time liquid volume prediction to achieve more stable grasping and manipulation.
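The abstract describes a two-branch network that fuses visual and tactile inputs and trains with a combined classification-plus-regression objective. The sketch below illustrates that general pattern in PyTorch; all layer sizes, input shapes (64x64 RGB, a 64-element tactile reading), the coarse volume-bin classification target, and the loss weight `alpha` are illustrative assumptions, not the authors' actual architecture or hyperparameters.

```python
import torch
import torch.nn as nn

class VisuoTactileNet(nn.Module):
    """Sketch of a two-branch visuo-tactile model with multi-task heads.

    The visual branch is a small CNN over RGB frames; the tactile branch is
    an MLP over a flat tactile reading. Features are fused by concatenation
    and fed to a classification head (coarse volume bin) and a regression
    head (volume in ml). All dimensions here are assumptions.
    """
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Visual branch: assumed 3-channel input, any spatial size >= 4x4.
        self.visual = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (batch, 32)
        )
        # Tactile branch: assumed 64-element tactile vector.
        self.tactile = nn.Sequential(
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),           # -> (batch, 32)
        )
        # Fusion by concatenation, then two task-specific heads.
        self.fuse = nn.Sequential(nn.Linear(32 + 32, 64), nn.ReLU())
        self.cls_head = nn.Linear(64, num_classes)  # coarse volume class
        self.reg_head = nn.Linear(64, 1)            # continuous volume (ml)

    def forward(self, rgb, tactile):
        z = self.fuse(torch.cat([self.visual(rgb), self.tactile(tactile)], dim=1))
        return self.cls_head(z), self.reg_head(z)

def multi_task_loss(logits, vol_pred, cls_target, vol_target, alpha=0.5):
    """Weighted sum of the classification and regression losses.

    `alpha` balances the two tasks; 0.5 is an arbitrary illustrative choice.
    """
    cls_loss = nn.functional.cross_entropy(logits, cls_target)
    reg_loss = nn.functional.mse_loss(vol_pred.squeeze(1), vol_target)
    return alpha * cls_loss + (1 - alpha) * reg_loss
```

Concatenation is the simplest fusion strategy; the paper's comparative evaluation of multi-task variants suggests the head structure and loss weighting are exactly the knobs one would sweep on the collected data.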


