Vision-based Engagement Detection in Virtual Reality

09/05/2016
by Ghassem Tofighi, et al.

User engagement modeling for manipulating actions in vision-based interfaces is one of the most important case studies of user mental state detection. In a Virtual Reality environment that employs camera sensors to recognize human activities, we must know when the user intends to perform an action and when they do not. Without a proper algorithm for recognizing engagement status, any activity could be interpreted as a manipulating action, known as the "Midas Touch" problem. The baseline approach to solving this problem is activating the gesture recognition system with a focus gesture, such as waving or raising a hand. However, a desirable natural user interface should be able to understand the user's mental status automatically. In this paper, a novel multi-modal model for engagement detection, DAIA, is presented. Using DAIA, the spectrum of mental states involved in performing an action is quantized into a finite number of engagement states. For this purpose, a Finite State Transducer (FST) is designed. This engagement framework shows how to integrate multi-modal information from user biometric data streams such as 2D and 3D imaging. The FST makes state transitions smoothly using a combination of several Boolean expressions. Our FST achieves a true detection rate of 92.3%. Results also show the FST can segment user hand gestures more robustly.
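As an illustrative sketch only (not the paper's actual DAIA design), the idea of quantizing engagement into a finite number of states and advancing between them via Boolean guards over multimodal features can be expressed as a small state machine. All state names, features, and transition rules below are assumptions for illustration:

```python
# Sketch of an engagement state machine: states are quantized engagement
# levels; transitions fire when Boolean guards over multimodal features
# (e.g., face orientation from 2D imaging, hand pose from 3D skeleton
# data) hold. Names and rules are hypothetical, not from the paper.
from dataclasses import dataclass

@dataclass
class Features:
    facing_camera: bool   # e.g., from 2D face detection
    hand_raised: bool     # e.g., from 3D skeleton tracking
    hand_moving: bool     # e.g., frame-to-frame hand displacement

# Hypothetical engagement states, ordered from disengaged to acting.
DISENGAGED, ATTENTIVE, ENGAGED, ACTING = "disengaged", "attentive", "engaged", "acting"

def step(state: str, f: Features) -> str:
    """Advance one frame: each branch is a Boolean guard over features."""
    if state == DISENGAGED:
        return ATTENTIVE if f.facing_camera else DISENGAGED
    if state == ATTENTIVE:
        if not f.facing_camera:
            return DISENGAGED
        return ENGAGED if f.hand_raised else ATTENTIVE
    if state == ENGAGED:
        if not f.facing_camera:
            return DISENGAGED
        if f.hand_raised and f.hand_moving:
            return ACTING  # hand motion is now treated as a command
        return ENGAGED if f.hand_raised else ATTENTIVE
    if state == ACTING:
        return ACTING if f.hand_moving else ENGAGED
    raise ValueError(f"unknown state: {state}")

# Only gestures observed while in ACTING would be forwarded to the
# gesture recognizer, which is how such a scheme avoids "Midas Touch".
state = DISENGAGED
for f in [Features(True, False, False),
          Features(True, True, False),
          Features(True, True, True)]:
    state = step(state, f)
print(state)  # -> acting
```

Gating the recognizer on the final state means incidental motion while disengaged is never interpreted as input.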


Related research

08/31/2016  Engagement Detection in Meetings
04/04/2015  Extending Touch-less Interaction on Vision Based Wearable Device (preprint)
04/25/2021  Comparing Hand Gestures and the Gamepad Interfaces for Locomotion in Virtual Environments
03/16/2023  Effect of Haptic Assistance Strategy on Mental Engagement in Fine Motor Tasks
02/25/2023  Real-Time Recognition of In-Place Body Actions and Head Gestures using Only a Head-Mounted Display
10/11/2022  A Perception-Driven Approach To Immersive Remote Telerobotics
03/01/2022  WEMAC: Women and Emotion Multi-modal Affective Computing dataset
