Visual Haptic Reasoning: Estimating Contact Forces by Observing Deformable Object Interactions

08/11/2022
by   Yufei Wang, et al.

Robotic manipulation of highly deformable cloth presents a promising opportunity to assist people with several daily tasks, such as washing dishes; folding laundry; or dressing, bathing, and hygiene assistance for individuals with severe motor impairments. In this work, we introduce a formulation that enables a collaborative robot to perform visual haptic reasoning with cloth: inferring the location and magnitude of applied forces during physical interaction. We present two distinct model representations, trained in physics simulation, that enable haptic reasoning using only visual and robot kinematic observations. We conducted quantitative evaluations of these models in simulation for robot-assisted dressing, bathing, and dish-washing tasks, and demonstrated that the trained models generalize across tasks with varying interactions, human body sizes, and object shapes. We also present results with a real-world mobile manipulator, which used our simulation-trained models to estimate applied contact forces while performing physically assistive tasks with cloth. Videos can be found on our project webpage.
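The abstract describes a learned mapping from visual and robot kinematic observations to the location and magnitude of contact forces on cloth. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: a tiny fully connected network in plain NumPy (with placeholder random weights) that maps a downsampled depth image of the cloth plus an end-effector kinematic vector to a per-point force-magnitude map. All dimensions and names here are assumptions for illustration; in the paper's setting, such a model would be trained in physics simulation against ground-truth contact forces.

```python
import numpy as np

rng = np.random.default_rng(0)

IMG_H, IMG_W = 16, 16       # downsampled depth image of the cloth (assumed size)
KIN_DIM = 7                 # e.g., end-effector position + orientation quaternion
HIDDEN = 64                 # hidden layer width (arbitrary)
OUT_POINTS = IMG_H * IMG_W  # one predicted force magnitude per image point

# Placeholder random weights; a trained model would learn these from simulation.
W1 = rng.normal(0.0, 0.1, (IMG_H * IMG_W + KIN_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, OUT_POINTS))
b2 = np.zeros(OUT_POINTS)

def estimate_contact_forces(depth_image, kinematics):
    """Map visual + kinematic observations to a per-point force-magnitude map."""
    x = np.concatenate([depth_image.ravel(), kinematics])
    h = np.maximum(0.0, x @ W1 + b1)   # ReLU hidden layer
    f = np.maximum(0.0, h @ W2 + b2)   # force magnitudes are non-negative
    return f.reshape(IMG_H, IMG_W)

# Example: one synthetic observation.
depth = rng.uniform(0.3, 0.6, (IMG_H, IMG_W))  # depth values in meters
ee_kin = rng.uniform(-1.0, 1.0, KIN_DIM)
force_map = estimate_contact_forces(depth, ee_kin)
```

The output is a spatial map over the observed cloth, so both pieces of haptic reasoning in the abstract (where force is applied and how much) fall out of a single forward pass.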



Related research

09/10/2021 · Bodies Uncovered: Learning to Manipulate Real Blankets Around People via Physics Simulations
While robots present an opportunity to provide physical assistance to ol...

08/28/2017 · Active Animations of Reduced Deformable Models with Environment Interactions
We present an efficient spacetime optimization method to automatically g...

09/27/2017 · Deep Haptic Model Predictive Control for Robot-Assisted Dressing
Robot-assisted dressing offers an opportunity to benefit the lives of ma...

03/02/2022 · DextAIRity: Deformable Manipulation Can be a Breeze
This paper introduces DextAIRity, an approach to manipulate deformable o...

07/11/2018 · Using Contact to Increase Robot Performance for Glovebox D&D Tasks
Glovebox decommissioning tasks usually require manipulating relatively h...

09/21/2023 · ForceSight: Text-Guided Mobile Manipulation with Visual-Force Goals
We present ForceSight, a system for text-guided mobile manipulation that...
