Smooth head tracking for virtual reality applications

10/27/2021
by Abdenour Amamra, et al.

In this work, we propose a new head-tracking solution for real-time human-machine interaction with virtual 3D environments. The solution leverages RGBD data to compute the virtual camera pose from the movements of the user's head. The process starts with the extraction of a set of facial features from the images delivered by the sensor. These features are matched against their counterparts in a reference image to compute the current head pose. A prediction step then estimates the most likely next head move (the final pose). Pythagorean Hodograph interpolation is adapted to determine the path and the local frames traversed between the two poses. The result is a smooth head trajectory that serves as input for placing the camera in the virtual scene according to the user's gaze. The resulting motion model is continuous in time, so it adapts to any rendering frame rate; ergonomic, since it frees the user from wearing tracking markers; smooth and free of rendering jerks; and torsion- and curvature-minimizing, as it produces a path with minimum bending energy.
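The construction described in the abstract builds on standard Pythagorean Hodograph (PH) quintic Hermite interpolation. As an illustration of that underlying technique (not the authors' implementation), the sketch below covers the planar case using the usual complex-number preimage: given the current and predicted head positions and the end derivatives, it solves for a quadratic preimage w(t) whose square is the curve's hodograph, so the speed |r'(t)| is a polynomial and the path can be sampled smoothly at any frame rate. The function names and the arc-length heuristic for choosing among the four admissible solutions are assumptions made for this sketch; a spatial version of the kind the paper describes would use quaternion preimages instead.

```python
# Minimal planar sketch of first-order Hermite interpolation with a
# Pythagorean-Hodograph (PH) quintic.  Points and derivatives are complex
# numbers (x + iy).  Illustrative only; not the paper's spatial method.
import cmath
from math import comb


def ph_quintic_hermite(p0, p1, d0, d1):
    """Return Bezier control points of a PH quintic r(t), t in [0, 1],
    with r(0)=p0, r(1)=p1, r'(0)=d0, r'(1)=d1 (all complex)."""
    dp = p1 - p0
    candidates = []
    for s in (+1, -1):                       # sign choice for w2
        w0 = cmath.sqrt(d0)
        w2 = s * cmath.sqrt(d1)
        # The end-displacement condition  int_0^1 w(t)^2 dt = p1 - p0
        # reduces to a complex quadratic  a*w1^2 + b*w1 + c = 0.
        a = 2.0 / 3.0
        b = w0 + w2
        c = w0 * w0 + w2 * w2 + w0 * w2 / 3.0 - 5.0 * dp
        disc = cmath.sqrt(b * b - 4.0 * a * c)
        for w1 in ((-b + disc) / (2 * a), (-b - disc) / (2 * a)):
            candidates.append((w0, w1, w2))

    def control_points(w0, w1, w2):
        # Bernstein coefficients of the hodograph r'(t) = w(t)^2 (degree 4).
        h = [w0 * w0,
             w0 * w1,
             (2 * w1 * w1 + w0 * w2) / 3.0,
             w1 * w2,
             w2 * w2]
        pts = [p0]
        for hk in h:                          # integrate the hodograph
            pts.append(pts[-1] + hk / 5.0)
        return pts

    def arc_length(w0, w1, w2):
        # Exact arc length of a PH quintic: integral of |w(t)|^2 over [0, 1].
        return (abs(w0) ** 2 + (w0 * w1.conjugate()).real
                + (2 * abs(w1) ** 2 + (w0 * w2.conjugate()).real) / 3.0
                + (w1 * w2.conjugate()).real + abs(w2) ** 2) / 5.0

    # Heuristic: keep the shortest candidate; the looping interpolants
    # produced by the other sign/root choices are typically much longer.
    best = min(candidates, key=lambda w: arc_length(*w))
    return control_points(*best)


def bezier(ctrl, t):
    """Evaluate a Bezier curve given its control points at parameter t."""
    n = len(ctrl) - 1
    return sum(c * comb(n, k) * t ** k * (1 - t) ** (n - k)
               for k, c in enumerate(ctrl))


if __name__ == "__main__":
    # Example: interpolate from the current head position (0) to a predicted
    # one (4 + 1j) with measured and predicted end velocities, then sample
    # one camera position per rendered frame.
    ctrl = ph_quintic_hermite(0 + 0j, 4 + 1j, 3 + 0j, 3 + 2j)
    path = [bezier(ctrl, k / 60.0) for k in range(61)]
```

Because the hodograph is a perfect square, the resulting speed profile has no square roots, which is what makes arc length and curvature cheap to evaluate and the sampled camera motion free of the jerks the abstract mentions.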

Related research

12/20/2016
A Statistical Approach to Continuous Self-Calibrating Eye Gaze Tracking for Head-Mounted Virtual Reality Systems
We present a novel, automatic eye gaze tracking scheme inspired by smoot...

11/01/2019
Audience measurement using a top-view camera and oriented trajectories
A crucial aspect for selecting optimal areas for commercial advertising ...

07/27/2022
AvatarPoser: Articulated Full-Body Pose Tracking from Sparse Motion Sensing
Today's Mixed Reality head-mounted displays track the user's head pose i...

12/18/2018
Mobile Head Tracking for eCommerce and Beyond
Shopping is difficult for people with motor impairments. This includes o...

09/23/2021
Low-Latency Immersive 6D Televisualization with Spherical Rendering
We present a method for real-time stereo scene capture and remote VR vis...

05/03/2019
Steadiface: Real-Time Face-Centric Stabilization on Mobile Phones
We present Steadiface, a new real-time face-centric video stabilization ...

03/28/2023
Perceptual Requirements for World-Locked Rendering in AR and VR
Stereoscopic, head-tracked display systems can show users realistic, wor...
