Rotating Objects via In-Hand Pivoting using Vision, Force and Touch

03/20/2023
by Shiyu Xu, et al.

We propose a robotic manipulation system that pivots objects on a surface using vision, wrist force and tactile sensing. We aim to control the rotation of an object around the grip point of a parallel gripper by allowing rotational slip while maintaining a desired wrist force profile. Our approach runs an end-effector position controller and a gripper width controller concurrently in a closed loop. The position controller tracks a desired force profile, derived from the object's dimensions and weight, using wrist force measurements and vision-based monitoring of the object pose. The gripper controller uses tactile sensing to keep the grip firm enough to prevent translational slip yet loose enough to induce rotational slip, tightening the grip whenever translational slip is detected. In experiments where the robot was tasked with rotating cuboid objects by 90 degrees, the multi-modal pivoting approach rotated the objects without lift or translational slip and was more energy-efficient than both single-sensor-modality variants and pick-and-place. While this work demonstrates the benefit of multi-modal sensing for the pivoting task, further work is needed to generalize the approach to arbitrary objects.
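To make the control architecture concrete, the following is a minimal sketch, not the authors' implementation, of the two concurrent controllers described above: a force-tracking end-effector position controller and a slip-regulating gripper width controller. All interfaces (estimate_object_pose, read_wrist_force_z, move_end_effector, read_tactile_translational_slip, set_width) are hypothetical placeholders, and the force set-point formula, gains and thresholds are illustrative assumptions only.

```python
import math
import time


def desired_force(object_weight_n: float, tilt_rad: float) -> float:
    """Illustrative force set-point: scale the object's weight by its tilt,
    so less downward force is commanded as the object pivots upright.
    (The actual profile in the paper is derived from object dimensions
    and weight; this formula is only a placeholder.)"""
    return object_weight_n * max(math.cos(tilt_rad), 0.0)


def pivot_step(robot, gripper, object_weight_n: float,
               kp_force: float = 0.002, grip_step_m: float = 0.0005):
    """One iteration of the closed loop."""
    pose = robot.estimate_object_pose()        # vision: object pose / tilt angle
    f_meas = robot.read_wrist_force_z()        # wrist force-torque sensor
    f_des = desired_force(object_weight_n, pose.tilt_rad)

    # Position controller: move the end effector vertically to track f_des.
    robot.move_end_effector(dz=kp_force * (f_des - f_meas))

    # Gripper width controller: relax to allow rotational slip, tighten
    # only when the tactile sensors report translational slip.
    if gripper.read_tactile_translational_slip():
        gripper.set_width(gripper.width - grip_step_m)   # tighten grip
    else:
        gripper.set_width(gripper.width + grip_step_m)   # loosen toward rotational slip


def pivot(robot, gripper, object_weight_n: float,
          target_angle_rad: float = math.pi / 2, rate_hz: float = 50.0):
    """Run both controllers (interleaved at a fixed rate) until the
    object has rotated to the target angle, e.g. 90 degrees."""
    while robot.estimate_object_pose().tilt_rad < target_angle_rad:
        pivot_step(robot, gripper, object_weight_n)
        time.sleep(1.0 / rate_hz)
```

In this sketch the two controllers are interleaved in a single loop for simplicity; running them as separate threads or ROS nodes would be an equally valid reading of "concurrently in a closed loop".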
