From Demonstrations to Task-Space Specifications: Using Causal Analysis to Extract Rule Parameterization from Demonstrations

06/08/2020
by   Daniel Angelov, et al.

Learning models of user behaviour is an important problem that is broadly applicable across many application domains requiring human-robot interaction. In this work, we show that it is possible to learn generative models for distinct user behavioural types, extracted from human demonstrations, by enforcing clustering of preferred task solutions within the latent space. We use these models to differentiate between user types and to find cases with overlapping solutions. Moreover, we can alter an initially guessed solution to satisfy the preferences that constitute a particular user type by backpropagating through the learned differentiable models. An advantage of structuring generative models in this way is that we can extract causal relationships between symbols that might form part of the user's specification of the task, as manifested in the demonstrations. We further parameterize these specifications through constraint optimization in order to find a safety envelope under which motion planning can be performed. We show that the proposed method is capable of correctly distinguishing between three user types, who differ in their degree of cautiousness in motion, while performing the task of moving objects with a kinesthetically driven robot in a tabletop environment. Our method successfully identifies the correct type, within the specified time, in 99% of cases, outperforming the baseline. We also show that our proposed method correctly changes a default trajectory to one satisfying a particular user specification, even with unseen objects. The resulting trajectory is shown to be directly implementable on a PR2 humanoid robot completing the same task.
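The idea of adapting an initial trajectory by backpropagating through a learned differentiable preference model can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the preference model is replaced by a hand-written differentiable score that rewards keeping waypoints above a minimum table clearance `Z_MIN` (a stand-in for a "cautious" user type), and the gradient step simply ascends that score.

```python
import numpy as np

# Assumed stand-in for a learned, differentiable user-preference model:
# a "cautious" user prefers waypoint heights z_t above a clearance Z_MIN.
Z_MIN = 0.10  # hypothetical preferred clearance in metres

def preference_score(z):
    # Penalise squared violation of the clearance; higher score = preferred.
    violation = np.maximum(0.0, Z_MIN - z)
    return -np.sum(violation ** 2)

def score_grad(z):
    # Analytic gradient of preference_score with respect to each waypoint.
    violation = np.maximum(0.0, Z_MIN - z)
    return 2.0 * violation

def adapt_trajectory(z, lr=0.5, steps=100):
    # Gradient ascent on the preference score, as a stand-in for
    # backpropagating through the learned generative model.
    z = z.copy()
    for _ in range(steps):
        z += lr * score_grad(z)
    return z

# A default trajectory that skims low over the table ...
default = np.linspace(0.0, 0.05, 8)
# ... adapted until it satisfies the cautious user's specification.
adapted = adapt_trajectory(default)
```

In the paper the score comes from a learned latent-space model and the gradients flow through the whole network; this sketch only shows the shape of the adaptation loop, not the authors' model.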

Related research

- Using Causal Analysis to Learn Specifications from Task Demonstrations (03/04/2019)
- Characterizing Input Methods for Human-to-robot Demonstrations (01/31/2019)
- Robot Eye-hand Coordination Learning by Watching Human Demonstrations: A Task Function Approximation Approach (09/29/2018)
- Elaborating on Learned Demonstrations with Temporal Logic Specifications (02/03/2020)
- Credit Assignment Safety Learning from Human Demonstrations (10/09/2021)
- Improving User Specifications for Robot Behavior through Active Preference Learning: Framework and Evaluation (07/24/2019)
- LMI-based Variable Impedance Controller Design from User Demonstrations and Preferences (09/21/2022)
