Automated acquisition of structured, semantic models of manipulation activities from human VR demonstration

11/27/2020
by Andrei Haidu, et al.

In this paper we present a system capable of collecting and annotating human-performed, robot-understandable everyday activities in virtual environments. Human movements are mapped into the simulated world using off-the-shelf virtual reality devices with full-body and eye-tracking capabilities. All interactions in the virtual world are physically simulated, so movements and their effects relate closely to the real world. During activity execution, a subsymbolic data logger records the environment and the human gaze on a per-frame basis, enabling offline scene reproduction and replay. Coupled with the physics engine, online monitors (symbolic data loggers) parse the simulation state using various grammars and record events, actions, and their effects in the simulated world.
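To make the two logging layers concrete, the following is a minimal sketch in Python of how such a pipeline could be structured; the class and field names (FrameRecord, SubsymbolicLogger, ContactMonitor) are illustrative assumptions, not the paper's actual API. A subsymbolic logger stores per-frame object poses and gaze for later replay, while a symbolic monitor turns contact changes reported by a physics engine into timestamped events.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

# --- Subsymbolic layer: raw per-frame state (poses, gaze) --------------------
@dataclass
class FrameRecord:
    timestamp: float
    object_poses: Dict[str, Tuple[float, float, float]]  # object id -> position
    gaze_target: str                                      # id of the gazed-at object

class SubsymbolicLogger:
    """Records the full scene state every frame, enabling offline replay."""
    def __init__(self) -> None:
        self.frames: List[FrameRecord] = []

    def log_frame(self, timestamp: float,
                  object_poses: Dict[str, Tuple[float, float, float]],
                  gaze_target: str) -> None:
        self.frames.append(FrameRecord(timestamp, dict(object_poses), gaze_target))

# --- Symbolic layer: online monitor parsing contact changes into events ------
@dataclass
class ContactEvent:
    obj_a: str
    obj_b: str
    start: float
    end: Optional[float] = None

class ContactMonitor:
    """Turns per-frame contact sets from the physics engine into timed events."""
    def __init__(self) -> None:
        self.active: Dict[frozenset, ContactEvent] = {}
        self.events: List[ContactEvent] = []

    def update(self, timestamp: float, contacts: List[Tuple[str, str]]) -> None:
        current = {frozenset(pair) for pair in contacts}
        # Newly appearing contacts open an event
        for pair in current - self.active.keys():
            a, b = sorted(pair)
            self.active[pair] = ContactEvent(a, b, start=timestamp)
        # Vanished contacts close their event
        for pair in set(self.active) - current:
            ev = self.active.pop(pair)
            ev.end = timestamp
            self.events.append(ev)

# Example usage with fabricated frames (hypothetical object ids)
if __name__ == "__main__":
    sub = SubsymbolicLogger()
    mon = ContactMonitor()
    sub.log_frame(0.0, {"hand": (0.0, 0.0, 1.0), "cup": (0.5, 0.0, 1.0)}, "cup")
    mon.update(0.0, [])
    sub.log_frame(0.5, {"hand": (0.5, 0.0, 1.0), "cup": (0.5, 0.0, 1.0)}, "cup")
    mon.update(0.5, [("hand", "cup")])  # grasp begins
    mon.update(2.0, [])                 # grasp ends
    print(mon.events)  # [ContactEvent(obj_a='cup', obj_b='hand', start=0.5, end=2.0)]
```

In a full system the per-frame records would feed an offline replay tool, and monitors like the one above would feed higher-level grammars that compose contact, support, and gaze events into semantically annotated actions.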
