Real-Time Recognition of In-Place Body Actions and Head Gestures using Only a Head-Mounted Display

02/25/2023
by   Jingbo Zhao, et al.

Body actions and head gestures are natural interfaces for interaction in virtual environments. Existing methods for in-place body action recognition typically require hardware beyond a head-mounted display (HMD), which makes body action interfaces difficult to introduce to ordinary virtual reality (VR) users, who usually own only an HMD. In addition, there is no unified solution for recognizing both in-place body actions and head gestures, which potentially hinders the exploration of these inputs for novel interaction experiences in virtual environments. We present a unified two-stream 1-D convolutional neural network (CNN) that recognizes body actions while a user performs walking-in-place (WIP) and head gestures while a user stands still, wearing only an HMD. Compared to previous approaches, our method requires no specialized hardware or tracking devices other than an HMD and recognizes a significantly larger number of body actions and head gestures than existing methods: ten in-place body actions and eight head gestures in total. This makes the method a readily available body action interface (head gestures included) for interaction with virtual environments. We demonstrate one utility of the interface through a virtual locomotion task. Results show that the body action interface reliably detects body actions for the VR locomotion task but is more physically demanding than a touch controller interface. The interface is promising for new VR experiences and applications, especially VR fitness applications where workouts are intended.
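The abstract describes a two-stream 1-D CNN that classifies actions from HMD tracking signals alone. As a rough illustration of that idea (not the authors' actual architecture; the stream contents, layer sizes, and window length below are assumptions), the sketch implements a minimal forward pass in NumPy: one stream convolves a window of HMD position samples, the other a window of HMD orientation samples, and the pooled features are fused before a softmax classifier over the action classes.

```python
import numpy as np

def conv1d(x, w, b):
    """Valid 1-D convolution: x is (C_in, T), w is (C_out, C_in, K), b is (C_out,)."""
    c_out, c_in, k = w.shape
    t_out = x.shape[1] - k + 1
    y = np.zeros((c_out, t_out))
    for t in range(t_out):
        # Correlate each output filter with the current length-K patch.
        y[:, t] = np.tensordot(w, x[:, t:t + k], axes=([1, 2], [0, 1])) + b
    return y

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def two_stream_forward(pos, rot, params):
    """Two-stream 1-D CNN: one conv stream per signal, fused before the classifier."""
    f1 = relu(conv1d(pos, params["w1"], params["b1"])).mean(axis=1)  # global avg pool
    f2 = relu(conv1d(rot, params["w2"], params["b2"])).mean(axis=1)
    fused = np.concatenate([f1, f2])                                 # late fusion
    logits = params["wc"] @ fused + params["bc"]
    return softmax(logits)

rng = np.random.default_rng(0)
T, K, F, N_CLASSES = 60, 5, 8, 10  # window length, kernel size, filters, ten actions
params = {
    "w1": rng.normal(0, 0.1, (F, 3, K)), "b1": np.zeros(F),   # position stream (x, y, z)
    "w2": rng.normal(0, 0.1, (F, 4, K)), "b2": np.zeros(F),   # orientation stream (quaternion)
    "wc": rng.normal(0, 0.1, (N_CLASSES, 2 * F)), "bc": np.zeros(N_CLASSES),
}
pos = rng.normal(size=(3, T))  # stand-in for a window of HMD position samples
rot = rng.normal(size=(4, T))  # stand-in for a window of HMD orientation samples
probs = two_stream_forward(pos, rot, params)
print(probs.shape)  # (10,): one probability per action class
```

In practice each stream would be trained end to end on labeled motion windows; the point here is only the data flow, i.e. that per-stream temporal convolutions over raw HMD trajectories can be pooled and fused into a single classifier without any extra trackers.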

