The scaling of the gripper affects the action and perception in teleoperated grasping via a robot-assisted minimally invasive surgery system

10/15/2017
by Amit Milstein, et al.

We use psychophysics to investigate human-centered transparency of grasping in unilateral robot-assisted minimally invasive surgery (RAMIS) without force feedback. Instead of the classical definition of transparency, we define here a human-centered transparency, focusing on natural action and perception in RAMIS. Demonstrating this approach, we assess the effect of gripper scaling on human-centered transparency in teleoperated grasping of rigid objects. Thirty-one participants were divided into three groups, each with a different scaling between the opening of the gripper of the surgeon-side manipulator and the gripper of the surgical instrument. Each participant performed two experiments: (1) an action experiment: reaching for and grasping different cylinders; and (2) a perception experiment: reporting the size of the cylinders. For two of the three gripper scalings, teleoperated grasping was similar to natural grasping. In the action experiment, the maximal grip aperture of the surgical instrument was proportional to the size of the grasped object, and its variability did not depend on object size, whereas in the perception experiment, consistent with Weber's law, the variability of perceived size increased with size. With the fine gripper scaling, both action and perception variabilities decreased with size, suggesting reduced transparency. These results suggest that in our RAMIS system, if the gripper scaling is not too fine, grasping kinematics and the gap between action and perception are similar to those of natural grasping. This means that, in the context of grasping, our system induces natural grasping behavior and is human-centered transparent. We anticipate that using psychophysics to optimize human-centered teleoperation control will eventually improve the usability of RAMIS.
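The manipulation at the heart of the study — a fixed scaling between the surgeon-side gripper opening and the instrument gripper opening — can be sketched as a simple linear mapping. This is an illustrative sketch only: the function name, angle units, and clamping limits below are assumptions, not details of the paper's actual control software.

```python
def scale_gripper(master_aperture_deg: float,
                  scaling_factor: float,
                  instrument_max_deg: float = 60.0) -> float:
    """Map the surgeon-side gripper opening to the instrument gripper opening.

    A fine scaling (scaling_factor < 1) means large master-gripper movements
    produce small instrument-gripper movements; a coarse scaling amplifies them.
    Names and limits here are hypothetical, for illustration only.
    """
    instrument_aperture = master_aperture_deg * scaling_factor
    # Clamp to the instrument's mechanical range.
    return max(0.0, min(instrument_aperture, instrument_max_deg))


# With a fine scaling of 0.5, opening the master gripper 40 degrees
# opens the instrument gripper only 20 degrees.
print(scale_gripper(40.0, 0.5))
```

Under such a mapping, a finer scaling compresses the instrument's aperture range, which is one plausible reason grasping kinematics deviated from natural grasping in the fine-scaling group.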
