Visual Rendering of Shapes on 2D Display Devices Guided by Hand Gestures

10/22/2018
by Abhik Singla, et al.

The design of touchless user interfaces is gaining popularity in various contexts. Using such interfaces, users can interact with electronic devices even when their hands are dirty or non-conductive; users with partial physical disabilities can also interact with electronic devices through such systems. Research in this direction has received a major boost from the emergence of low-cost sensors such as the Leap Motion, Kinect, and RealSense devices. In this paper, we propose a Leap Motion controller-based methodology to facilitate the rendering of 2D and 3D shapes on display devices. The proposed method tracks finger movements while users perform natural gestures within the field of view of the sensor. In the next phase, the trajectories are analyzed to extract extended Npen++ features in 3D. These features represent finger movements during the gestures, and they are fed to a unidirectional left-to-right Hidden Markov Model (HMM) for training. A one-to-one mapping between gestures and shapes is proposed. Finally, the shapes corresponding to these gestures are rendered on the display using the MuPad interface. We have created a dataset of 5400 samples recorded by 10 volunteers, comprising 18 geometric and 18 non-geometric shapes such as "circle", "rectangle", "flower", "cone", and "sphere". The proposed methodology achieves an accuracy of 92.87% using 5-fold cross-validation. Our experiments reveal that the extended 3D features outperform existing 3D features in the context of shape representation and classification. The method can be used to develop useful HCI applications for smart display devices.
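The classification stage the abstract describes, per-gesture left-to-right HMMs scored on feature sequences extracted from finger trajectories, can be sketched in a minimal form. The snippet below is an illustrative sketch only: the state count, the four-symbol quantized-direction alphabet, and all probabilities are assumptions for demonstration, not the paper's parameters, and the two "gesture classes" are toy stand-ins for trained models.

```python
import numpy as np

def left_to_right_transmat(n_states, p_stay=0.6):
    """Unidirectional left-to-right transitions: each state may only
    remain where it is or advance to the next state (last state absorbs)."""
    T = np.zeros((n_states, n_states))
    for i in range(n_states - 1):
        T[i, i] = p_stay
        T[i, i + 1] = 1.0 - p_stay
    T[-1, -1] = 1.0
    return T

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm for numerical stability."""
    alpha = start * emit[:, obs[0]]
    scale = alpha.sum()
    loglik = np.log(scale)
    alpha /= scale
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        scale = alpha.sum()
        loglik += np.log(scale)
        alpha /= scale
    return loglik

# Toy setup: two hypothetical gesture classes, each a 3-state
# left-to-right HMM over 4 quantized trajectory-direction symbols.
trans = left_to_right_transmat(3)
start = np.array([1.0, 0.0, 0.0])  # left-to-right HMMs start in state 0
emit_a = np.array([[0.7, 0.1, 0.1, 0.1],
                   [0.1, 0.7, 0.1, 0.1],
                   [0.1, 0.1, 0.7, 0.1]])
emit_b = emit_a[::-1].copy()  # a contrasting class with reversed emissions

# An observation sequence typical of class A; classification picks
# the model with the higher log-likelihood.
seq = [0, 0, 1, 1, 2, 2]
ll_a = forward_loglik(seq, start, trans, emit_a)
ll_b = forward_loglik(seq, start, trans, emit_b)
```

In practice each of the 36 shape classes would get its own HMM trained on the extended Npen++ feature sequences, and a test gesture would be assigned to the class whose model yields the highest likelihood.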


