Using Causal Analysis to Learn Specifications from Task Demonstrations

03/04/2019
by Daniel Angelov, et al.

Learning models of user behaviour is an important problem that is broadly applicable across many application domains requiring human-robot interaction. In this work we show that it is possible to learn a generative model for distinct user behavioural types, extracted from human demonstrations, by enforcing clustering of preferred task solutions within the latent space. We use this model to differentiate between user types and to find cases with overlapping solutions. Moreover, we can alter an initially guessed solution to satisfy the preferences that constitute a particular user type by backpropagating through the learned differentiable model. An advantage of structuring generative models in this way is that it allows us to extract causal relationships between symbols that might form part of the user's specification of the task, as manifested in the demonstrations. We show that the proposed method is capable of correctly distinguishing between three user types, who differ in their degree of cautiousness of motion, while performing the task of moving objects with a kinesthetically driven robot in a tabletop environment. Our method successfully identifies the correct type, within the specified time, in 99% of the cases, outperforming a baseline. We also show that our proposed method correctly changes a default trajectory to one satisfying a particular user specification, even with unseen objects. The resulting trajectory is shown to be directly implementable on a PR2 humanoid robot completing the same task.
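The trajectory-refinement step described above can be sketched as gradient descent on the waypoints of an initial trajectory, driving a differentiable model's loss toward the latent cluster of the desired user type. The sketch below is illustrative only: the quadratic loss stands in for the paper's learned network, and all names (`model_grad`, `refine`, the "cautious" preference) are hypothetical, not taken from the paper.

```python
# Minimal sketch, assuming a differentiable model whose loss measures how far
# a trajectory's embedding lies from the latent cluster of the target user
# type. Here the "embedding" is the trajectory itself and the loss is
# quadratic, so the gradient is available in closed form; in the paper this
# gradient would come from backpropagation through the learned network.

def model_loss(traj, center):
    """Squared distance from the latent cluster centre of the desired type."""
    return 0.5 * sum((t - c) ** 2 for t, c in zip(traj, center))

def model_grad(traj, center):
    """Analytic gradient of model_loss w.r.t. the trajectory waypoints."""
    return [t - c for t, c in zip(traj, center)]

def refine(traj, center, lr=0.1, steps=200):
    """Gradient-descend the waypoints until the model prefers the result."""
    traj = list(traj)
    for _ in range(steps):
        grad = model_grad(traj, center)
        traj = [t - lr * g for t, g in zip(traj, grad)]
    return traj

# Heights of 5 waypoints along a default straight-line motion.
default = [0.0, 0.0, 0.0, 0.0, 0.0]
# A "cautious" user prefers the mid-section of the path lifted clear of objects.
cautious = [0.0, 0.3, 0.3, 0.3, 0.0]

refined = refine(default, cautious)
print(all(abs(r - c) < 1e-3 for r, c in zip(refined, cautious)))  # True
```

Because the loss is differentiable end-to-end, the same loop works unchanged when the hand-coded gradient is replaced by automatic differentiation through a trained network.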

