Human POSEitioning System (HPS): 3D Human Pose Estimation and Self-localization in Large Scenes from Body-Mounted Sensors

03/31/2021
by Vladimir Guzov, et al.

We introduce HPS (Human POSEitioning System), a method to recover the full 3D pose of a human registered with a 3D scan of the surrounding environment, using wearable sensors. Using IMUs attached to the body limbs and a head-mounted camera looking outwards, HPS fuses camera-based self-localization with IMU-based human body tracking. The former provides drift-free but noisy position and orientation estimates, while the latter is accurate over short time spans but subject to drift over longer periods. We show that our optimization-based integration exploits the benefits of both, resulting in pose accuracy free of drift. Furthermore, we integrate 3D scene constraints into the optimization, such as foot contact with the ground, resulting in physically plausible motion. HPS complements more common third-person-based 3D pose estimation methods. It allows capturing larger recording volumes and longer periods of motion, and could be used for VR/AR applications where humans interact with the scene without requiring a direct line of sight to an external camera, or to train agents that navigate and interact with the environment based on first-person visual input, like real humans. With HPS, we recorded a dataset of humans interacting with large 3D scenes (300-1000 sq. m), consisting of 7 subjects and more than 3 hours of diverse motion. The dataset, code and video will be available on the project page: http://virtualhumans.mpi-inf.mpg.de/hps/ .
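To make the camera/IMU fusion idea concrete, the following is a minimal, illustrative sketch and not the authors' implementation: it fuses a noisy but drift-free "camera localization" trajectory with drifting but locally accurate "IMU increments" through a least-squares objective, plus a soft ground-contact term. All data, weights (w_cam, w_imu, w_contact) and the flat-floor assumption are hypothetical placeholders.

import numpy as np
from scipy.optimize import least_squares

T = 100                                    # number of frames (hypothetical)
rng = np.random.default_rng(0)

# Simulated ground-truth root trajectory over a flat floor (root ~0.9 m above ground).
gt = np.stack([np.linspace(0, 5, T), np.zeros(T), np.full(T, 0.9)], axis=1)

# Camera self-localization: noisy per-frame positions, but no accumulated drift.
cam = gt + rng.normal(0, 0.05, gt.shape)

# IMU odometry: accurate frame-to-frame increments that slowly drift over time.
drift = np.cumsum(rng.normal(0, 0.005, gt.shape), axis=0)
imu_delta = np.diff(gt, axis=0) + np.diff(drift, axis=0)

w_cam, w_imu, w_contact = 1.0, 10.0, 5.0   # hypothetical term weights
foot_offset = 0.9                          # assumed root height when the foot touches the ground

def residuals(x):
    p = x.reshape(T, 3)
    r_cam = w_cam * (p - cam).ravel()                          # stay close to camera localization
    r_imu = w_imu * (np.diff(p, axis=0) - imu_delta).ravel()   # preserve IMU motion increments
    r_contact = w_contact * (p[:, 2] - foot_offset)            # soft ground-contact constraint
    return np.concatenate([r_cam, r_imu, r_contact])

sol = least_squares(residuals, cam.ravel())
fused = sol.x.reshape(T, 3)
print("mean error, camera only:", np.linalg.norm(cam - gt, axis=1).mean())
print("mean error, fused      :", np.linalg.norm(fused - gt, axis=1).mean())

The fused trajectory inherits the long-term stability of the camera term and the short-term smoothness of the IMU term; the actual HPS optimization additionally estimates full-body pose and richer scene constraints.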


Related research

08/20/2019 · Resolving 3D Human Pose Ambiguities with 3D Scene Constraints
To understand and analyze human behavior, we need to capture humans movi...

05/05/2022 · Visually plausible human-object interaction capture from wearable sensors
In everyday lives, humans naturally modify the surrounding environment t...

12/05/2019 · Generating 3D People in Scenes without People
We present a fully-automatic system that takes a 3D scene and generates ...

07/20/2020 · Wearable camera-based human absolute localization in large warehouses
In a robotised warehouse, as in any place where robots move autonomously...

12/21/2020 · Populating 3D Scenes by Learning Human-Scene Interaction
Humans live within a 3D space and constantly interact with it to perform...

03/17/2022 · HSC4D: Human-centered 4D Scene Capture in Large-scale Indoor-outdoor Space Using Wearable IMUs and LiDAR
We propose Human-centered 4D Scene Capture (HSC4D) to accurately and eff...

04/11/2023 · Animation Fidelity in Self-Avatars: Impact on User Performance and Sense of Agency
The use of self-avatars is gaining popularity thanks to affordable VR he...
