iGibson, a Simulation Environment for Interactive Tasks in Large Realistic Scenes

12/05/2020
by Bokui Shen, et al.

We present iGibson, a novel simulation environment for developing robotic solutions to interactive tasks in large-scale realistic scenes. The environment contains fifteen fully interactive, home-sized scenes populated with rigid and articulated objects. The scenes are replicas of 3D-scanned real-world homes, so the distribution and layout of objects match those of the real world. iGibson integrates several key features to facilitate the study of interactive tasks: i) generation of high-quality virtual sensor signals (RGB, depth, segmentation, LiDAR, flow, among others), ii) domain randomization over object materials (both visual texture and dynamics) and/or shapes, iii) integrated sampling-based motion planners that generate collision-free trajectories for robot bases and arms, and iv) an intuitive human-iGibson interface that enables efficient collection of human demonstrations. Through experiments, we show that the full interactivity of the scenes enables agents to learn useful visual representations that accelerate training on downstream manipulation tasks. We also show that iGibson's features enable generalization of navigation agents, and that the human-iGibson interface and integrated motion planners make imitation learning of simple human-demonstrated behaviors efficient. iGibson is open-sourced with comprehensive examples and documentation. For more information, visit our project website: http://svl.stanford.edu/igibson/
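The abstract describes iGibson's environment interface only at a high level (scene loading, virtual sensors, stepping an agent). The sketch below illustrates how such an environment is typically driven; it assumes the open-source iGibsonEnv class with a gym-style reset/step API, and the config file name is a hypothetical placeholder. Exact module paths and observation keys may differ between releases, so treat this as an illustration rather than a definitive usage example.

    # Minimal interaction sketch. Assumptions: the open-source iGibsonEnv class
    # with a gym-style reset/step API; the config file name below is hypothetical,
    # and module paths may differ between iGibson releases.
    from igibson.envs.igibson_env import iGibsonEnv

    # The YAML config typically selects the scene, robot, task, and sensor
    # modalities (e.g. RGB and depth observations).
    env = iGibsonEnv(config_file="turtlebot_demo.yaml", mode="headless")

    obs = env.reset()
    for _ in range(100):
        action = env.action_space.sample()          # placeholder random policy
        obs, reward, done, info = env.step(action)  # obs holds the configured sensor readings
        if done:
            obs = env.reset()
    env.close()

The same loop structure serves for reinforcement learning or for collecting demonstrations through the human-iGibson interface; only the source of the action changes.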


