Deep Haptic Model Predictive Control for Robot-Assisted Dressing

09/27/2017
by Zackory Erickson, et al.

Robot-assisted dressing offers an opportunity to benefit the lives of many people with disabilities, such as some older adults. However, robots currently lack common sense about the physical implications of their actions on people. The physical implications of dressing are complicated by non-rigid garments, which can result in a robot indirectly applying high forces to a person's body. We present a deep recurrent model that, when given a proposed action by the robot, predicts the forces a garment will apply to a person's body. We also show that a robot can provide better dressing assistance by using this model with model predictive control. The predictions made by our model only use haptic and kinematic observations from the robot's end effector, which are readily attainable. Collecting training data from real-world physical human-robot interaction can be time consuming, costly, and put people at risk. Instead, we train our predictive model using data collected in an entirely self-supervised fashion from a physics-based simulation. We evaluated our approach with a PR2 robot that attempted to pull a hospital gown onto the arms of 10 human participants. With a 0.2 s prediction horizon, our controller succeeded at high rates and lowered applied force while navigating the garment around a person's fist and elbow without getting caught. Shorter prediction horizons resulted in significantly reduced performance, with the sleeve catching on the participants' fists and elbows, demonstrating the value of our model's predictions. These behaviors of mitigating catches emerged from our deep predictive model and the controller objective function, which primarily penalizes high forces.
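The abstract outlines the control loop: a recurrent network predicts the forces the garment will apply to the person given a proposed end-effector motion, and a model predictive controller selects actions whose predicted forces are low. The sketch below illustrates that loop under stated assumptions; it is not the authors' implementation. The LSTM architecture, action parameterization, candidate-sampling scheme, cost weights, and assumed dressing direction are illustrative placeholders, not details from the paper.

```python
# Minimal sketch of haptic model predictive control with a learned recurrent
# force predictor, in the spirit of the approach described above. Not the
# authors' code: architecture, sampling scheme, and cost weights are assumptions.
import torch
import torch.nn as nn


class ForcePredictor(nn.Module):
    """Recurrent model mapping candidate end-effector motions to the forces
    the garment is predicted to apply to the person's body."""

    def __init__(self, action_dim=3, force_dim=3, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(action_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, force_dim)

    def forward(self, actions, hidden=None):
        # actions: (batch, horizon, action_dim) end-effector displacements.
        # hidden would encode past haptic/kinematic observations from the gripper.
        out, hidden = self.lstm(actions, hidden)
        return self.head(out), hidden


def mpc_step(model, hidden=None, horizon_steps=4, n_samples=128, max_step=0.01):
    """Sample candidate action sequences, predict the resulting forces, and
    return the first action of the lowest-cost sequence (receding horizon)."""
    candidates = torch.empty(n_samples, horizon_steps, 3).uniform_(-max_step, max_step)
    if hidden is not None:
        # Replicate the single-rollout hidden state across all candidates.
        hidden = tuple(h.expand(-1, n_samples, -1).contiguous() for h in hidden)
    with torch.no_grad():
        forces, _ = model(candidates, hidden)      # (n_samples, horizon, force_dim)
    force_cost = forces.norm(dim=-1).sum(dim=1)    # penalize high predicted forces
    progress = candidates[:, :, 0].sum(dim=1)      # assumed dressing direction: +x
    cost = force_cost - 0.1 * progress             # the force term dominates the cost
    best = torch.argmin(cost)
    return candidates[best, 0]                     # execute only the first action


if __name__ == "__main__":
    model = ForcePredictor()
    action = mpc_step(model)                       # 3D end-effector displacement
    print("next end-effector displacement:", action.tolist())
```

Executing only the first action of the best candidate sequence and then replanning mirrors the receding-horizon structure of model predictive control; in the actual system, the recurrent state would be updated from measured haptic and kinematic signals at the robot's end effector rather than sampled actions alone.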


