Multiple Interactions Made Easy (MIME): Large Scale Demonstrations Data for Imitation

10/16/2018
by Pratyusha Sharma, et al.

In recent years, we have seen an emergence of data-driven approaches in robotics. However, most existing efforts and datasets are either in simulation or focus on a single task in isolation, such as grasping, pushing, or poking. To make progress and capture the space of manipulation, we would need to collect a large-scale dataset of diverse tasks such as pouring, opening bottles, and stacking objects. But how does one collect such a dataset? In this paper, we present the largest available robotic-demonstration dataset (MIME), which contains 8260 human-robot demonstrations over 20 different robotic tasks (https://sites.google.com/view/mimedataset). These tasks range from the simple task of pushing objects to the difficult task of stacking household objects. Our dataset consists of videos of human demonstrations and kinesthetic trajectories of robot demonstrations. We also propose to use this dataset for the task of mapping third-person video features to robot trajectories. Furthermore, we present two different approaches using this dataset and evaluate the predicted robot trajectories against ground-truth trajectories. We hope our dataset inspires research in multiple areas, including visual imitation, trajectory prediction, and multi-task robotic learning.
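The abstract mentions evaluating predicted robot trajectories against ground-truth kinesthetic trajectories. The sketch below is only a rough illustration of one plausible way to do that, not the paper's actual evaluation protocol: it resamples a predicted and a ground-truth joint-space trajectory to a common length and averages the per-timestep Euclidean distance. The function names, the 7-DoF arm, and the resampling choice are all illustrative assumptions.

```python
import numpy as np

def resample(traj, n):
    """Linearly resample a (T, D) trajectory to n timesteps."""
    traj = np.asarray(traj, dtype=float)
    t_old = np.linspace(0.0, 1.0, len(traj))
    t_new = np.linspace(0.0, 1.0, n)
    return np.stack([np.interp(t_new, t_old, traj[:, d])
                     for d in range(traj.shape[1])], axis=1)

def trajectory_error(pred, gt, n=100):
    """Mean per-timestep Euclidean distance in joint space between a
    predicted and a ground-truth trajectory, after resampling both to a
    common length so demonstrations of different durations are comparable."""
    pred_r, gt_r = resample(pred, n), resample(gt, n)
    return float(np.mean(np.linalg.norm(pred_r - gt_r, axis=1)))

# Hypothetical example: a 7-DoF arm, with a predicted trajectory compared
# against a (simulated) kinesthetic ground-truth trajectory.
rng = np.random.default_rng(0)
gt = np.cumsum(rng.normal(scale=0.01, size=(180, 7)), axis=0)
pred = gt[::2] + rng.normal(scale=0.02, size=(90, 7))
print(f"mean joint-space error: {trajectory_error(pred, gt):.4f} rad")
```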


Related research

Learning to Generalize Across Long-Horizon Tasks from Human Demonstrations (03/13/2020)

Bridge Data: Boosting Generalization of Robotic Skills with Cross-Domain Datasets (09/27/2021)

DFL-TORO: A One-Shot Demonstration Framework for Learning Time-Optimal Robotic Manufacturing Tasks (09/18/2023)

Learning Periodic Tasks from Human Demonstrations (09/28/2021)

Learning from Demonstration with Weakly Supervised Disentanglement (06/16/2020)

RLBench: The Robot Learning Benchmark & Learning Environment (09/26/2019)

iGibson 2.0: Object-Centric Simulation for Robot Learning of Everyday Household Tasks (08/06/2021)
