One-Shot Imitation from Observing Humans via Domain-Adaptive Meta-Learning

by Tianhe Yu, et al.

Humans and animals are capable of learning a new behavior by observing others perform the skill just once. We consider the problem of allowing a robot to do the same -- learning from raw video pixels of a human, even when there is substantial domain shift in the perspective, environment, and embodiment between the robot and the observed human. Prior approaches to this problem have hand-specified how human and robot actions correspond, and often relied on explicit human pose detection systems. In this work, we present an approach for one-shot learning from a video of a human by using human and robot demonstration data from a variety of previous tasks to build up prior knowledge through meta-learning. Then, combining this prior knowledge with only a single video demonstration from a human, the robot can perform the task that the human demonstrated. We show experiments on both a PR2 arm and a Sawyer arm, demonstrating that after meta-learning, the robot can learn to place, push, and pick-and-place new objects using just one video of a human performing the manipulation.
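The abstract describes a meta-learning setup in which the robot adapts to a new task from a single human video (which contains no action labels), using a learned adaptation loss, and is evaluated against a paired robot demonstration. Below is a minimal sketch of that inner/outer structure, not the authors' implementation: it assumes a linear policy for simplicity, and the names `psi`, `daml_inner_step`, and `outer_loss` are hypothetical stand-ins for the learned loss parameters, the adaptation step, and the behavioral-cloning objective.

```python
import numpy as np

rng = np.random.default_rng(0)

def adaptation_grad(theta, psi, human_obs):
    """Gradient of a *learned* adaptation loss that needs no action labels:
    it compares the policy's output on human-video features against a target
    inferred from those same features by the learned parameters psi."""
    pred = human_obs @ theta      # policy output on the human observation
    target = human_obs @ psi      # action proxy inferred from video alone
    return 2.0 * (pred - target) * human_obs

def daml_inner_step(theta, psi, human_obs, inner_lr=0.1):
    """One gradient step of task adaptation from a single human observation."""
    return theta - inner_lr * adaptation_grad(theta, psi, human_obs)

def outer_loss(theta_adapted, robot_obs, robot_action):
    """Behavioral-cloning loss on the paired robot demonstration; in
    meta-training this loss would be backpropagated into theta and psi."""
    return float((robot_obs @ theta_adapted - robot_action) ** 2)

# Toy data: one human observation and one paired robot demo for a task.
dim = 4
theta = rng.normal(size=dim)   # meta-learned policy initialization
psi = rng.normal(size=dim)     # meta-learned adaptation-loss parameters
human_obs = rng.normal(size=dim)
robot_obs = rng.normal(size=dim)
robot_action = 0.5

theta_prime = daml_inner_step(theta, psi, human_obs)
loss = outer_loss(theta_prime, robot_obs, robot_action)
```

The key design point this sketch illustrates is that the inner (test-time) update touches only human video features, while robot action labels appear only in the outer objective, so after meta-training the robot can adapt from a human video alone.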


One-Shot Visual Imitation Learning via Meta-Learning

In order for a robot to be a generalist that can perform a wide range of...

A new benchmark for group distribution shifts in hand grasp regression for object manipulation. Can meta-learning raise the bar?

Understanding hand-object pose with computer vision opens the door to ne...

One-Shot Hierarchical Imitation Learning of Compound Visuomotor Tasks

We consider the problem of learning multi-stage vision-based tasks on a ...

MetaPix: Few-Shot Video Retargeting

We address the task of unsupervised retargeting of human actions from on...

ARTiS: Appearance-based Action Recognition in Task Space for Real-Time Human-Robot Collaboration

To have a robot actively supporting a human during a collaborative task,...

Heterogeneous Learning from Demonstration

The development of human-robot systems able to leverage the strengths of...

A Framework for Robot Manipulation: Skill Formalism, Meta Learning and Adaptive Control

In this paper we introduce a novel framework for expressing and learning...
