One-Shot Imitation from Observing Humans via Domain-Adaptive Meta-Learning

02/05/2018
by Tianhe Yu, et al.

Humans and animals are capable of learning a new behavior by observing others perform the skill just once. We consider the problem of allowing a robot to do the same -- learning from raw video pixels of a human, even when there is substantial domain shift in the perspective, environment, and embodiment between the robot and the observed human. Prior approaches to this problem have hand-specified how human and robot actions correspond and have often relied on explicit human pose detection systems. In this work, we present an approach for one-shot learning from a video of a human by using human and robot demonstration data from a variety of previous tasks to build up prior knowledge through meta-learning. Then, combining this prior knowledge with only a single video demonstration from a human, the robot can perform the task that the human demonstrated. We show experiments on both a PR2 arm and a Sawyer arm, demonstrating that after meta-learning, the robot can learn to place, push, and pick-and-place new objects using just one video of a human performing the manipulation.
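To make the meta-learning idea concrete, below is a minimal sketch of the MAML-style inner/outer loop described in the abstract: an inner adaptation step on the human video that uses a learned loss (so no action labels from the human are needed), and an outer behavior-cloning update on the paired robot demonstration. This is an illustrative simplification, not the authors' code: toy feature vectors stand in for video pixels, the policy and learned loss are small MLPs, and all names (PolicyNet, AdaptationLoss, inner_adapt, meta_step) are assumptions.

```python
# Sketch of domain-adaptive meta-learning, heavily simplified.
# Assumes paired meta-training data: (human video features, robot observations, robot actions).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PolicyNet(nn.Module):
    """Maps an observation feature vector to a robot action."""
    def __init__(self, obs_dim=32, act_dim=7):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(),
                                 nn.Linear(64, act_dim))
    def forward(self, obs, params=None):
        if params is None:
            return self.net(obs)
        # Functional forward pass with adapted parameters (used after the inner loop).
        h = F.relu(F.linear(obs, params[0], params[1]))
        return F.linear(h, params[2], params[3])

class AdaptationLoss(nn.Module):
    """Learned adaptation objective: scores the policy's outputs on the human
    video, so the inner step needs no action labels from the human."""
    def __init__(self, obs_dim=32, act_dim=7):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim + act_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 1))
    def forward(self, human_obs, predicted_actions):
        return self.net(torch.cat([human_obs, predicted_actions], dim=-1)).mean()

def inner_adapt(policy, learned_loss, human_obs, lr=0.01):
    """One gradient step on the learned loss, using only the human video."""
    params = list(policy.net.parameters())
    loss = learned_loss(human_obs, policy(human_obs))
    grads = torch.autograd.grad(loss, params, create_graph=True)
    return [p - lr * g for p, g in zip(params, grads)]

def meta_step(policy, learned_loss, meta_opt, tasks, lr_inner=0.01):
    """Outer loop: the adapted policy is trained to imitate the robot
    demonstration's actions (behavior cloning), which also trains the learned loss."""
    meta_opt.zero_grad()
    outer_loss = 0.0
    for human_obs, robot_obs, robot_actions in tasks:
        adapted = inner_adapt(policy, learned_loss, human_obs, lr_inner)
        outer_loss = outer_loss + F.mse_loss(policy(robot_obs, adapted), robot_actions)
    outer_loss.backward()
    meta_opt.step()
    return outer_loss.item()

if __name__ == "__main__":
    policy, learned_loss = PolicyNet(), AdaptationLoss()
    meta_opt = torch.optim.Adam(list(policy.parameters()) +
                                list(learned_loss.parameters()), lr=1e-3)
    # Toy meta-training tasks with random tensors as placeholders.
    tasks = [(torch.randn(10, 32), torch.randn(10, 32), torch.randn(10, 7))
             for _ in range(4)]
    for step in range(100):
        meta_step(policy, learned_loss, meta_opt, tasks)
    # At test time: one human video -> inner_adapt -> the adapted policy acts on the robot.
```

In the paper the policy and learned loss operate on raw video frames with convolutional networks; the sketch keeps only the two-level optimization structure that lets a single human video adapt the robot's policy.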


