HOOV: Hand Out-Of-View Tracking for Proprioceptive Interaction using Inertial Sensing

03/13/2023
by   Paul Streli, et al.

Current Virtual Reality systems are designed for interaction under visual control. Using built-in cameras, headsets track the user's hands or hand-held controllers while they are inside the field of view. Current systems thus ignore the user's interaction with off-screen content – virtual objects that the user could quickly access through proprioception without requiring laborious head motions to bring them into focus. In this paper, we present HOOV, a wrist-worn sensing method that allows VR users to interact with objects outside their field of view. Based on the signals of a single wrist-worn inertial sensor, HOOV continuously estimates the user's hand position in 3-space to complement the headset's tracking as the hands leave the tracking range. Our novel data-driven method predicts hand positions and trajectories from just the continuous estimation of hand orientation, which by itself is stable based solely on inertial observations. Our inertial sensing simultaneously detects finger pinching to register off-screen selection events, confirms them using a haptic actuator inside our wrist device, and thus allows users to select, grab, and drop virtual content. We compared HOOV's performance with a camera-based optical motion capture system in two evaluations. In the first, participants interacted based on tracking information from the motion capture system to assess the accuracy of their proprioceptive input, whereas in the second, they interacted based on HOOV's real-time estimations. We found that HOOV's target-agnostic estimations had a mean tracking error of 7.7 cm, which allowed participants to reliably access virtual objects around their body without first bringing them into focus. We demonstrate several applications that leverage the larger input space HOOV opens up for quick proprioceptive interaction, and conclude by discussing the potential of our technique.
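The abstract notes that hand orientation can be estimated stably from inertial observations alone. A common building block for such an estimate is integrating the gyroscope's angular velocity into an orientation quaternion via the exponential map. The sketch below is an illustration of that general principle, not the authors' actual method; the function names and the constant-rate example are assumptions for demonstration.

```python
import numpy as np

def quat_mul(q, r):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(q, omega, dt):
    # Advance the orientation quaternion q by the angular velocity
    # omega (rad/s, body frame) over timestep dt (s).
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return q
    axis = omega / np.linalg.norm(omega)
    dq = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))
    # Renormalize to counter floating-point drift.
    q = quat_mul(q, dq)
    return q / np.linalg.norm(q)

# Example: a constant 90 deg/s rotation about z, integrated for 1 s
# in 100 steps, yields a 90-degree rotation quaternion.
q = np.array([1.0, 0.0, 0.0, 0.0])
omega = np.array([0.0, 0.0, np.pi / 2])
for _ in range(100):
    q = integrate_gyro(q, omega, 0.01)
```

In practice, pure gyro integration drifts over time, which is why approaches like the one described here complement it with learned models or additional sensor cues rather than relying on integration alone.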



Related research

07/27/2022  AvatarPoser: Articulated Full-Body Pose Tracking from Sparse Motion Sensing
Today's Mixed Reality head-mounted displays track the user's head pose i...

08/17/2020  PAR: Personal Activity Radius Camera View for Contextual Sensing
Contextual sensing using wearable cameras has seen a variety of differen...

05/18/2020  Designing Just-in-Time Detection for Gamified Fitness Frameworks
This paper presents our findings from a multi-year effort to detect moti...

02/23/2023  FingerMapper: Mapping Finger Motions onto Virtual Arms to Enable Safe Virtual Reality Interaction in Confined Spaces
Whole-body movements enhance the presence and enjoyment of Virtual Reali...

09/23/2016  EgoCap: Egocentric Marker-less Motion Capture with Two Fisheye Cameras
Marker-based and marker-less optical skeletal motion-capture methods use...

12/15/2022  DOPAMINE: Doppler frequency and Angle of arrival MINimization of tracking Error for extended reality
In this paper, we investigate how Joint Communication And Sensing (JCAS)...

03/20/2022  FaceOri: Tracking Head Position and Orientation Using Ultrasonic Ranging on Earphones
Face orientation can often indicate users' intended interaction target. ...
