A Deep Learning Framework for Recognizing both Static and Dynamic Gestures

06/11/2020
by Osama Mazhar, et al.

Intuitive user interfaces are indispensable for interacting with human-centric smart environments. In this paper, we propose a unified framework that recognizes both static and dynamic gestures using simple RGB vision (without depth sensing), which makes it suitable for inexpensive human-machine interaction (HMI). We rely on a spatial attention-based strategy that employs SaDNet, our proposed Static and Dynamic gestures Network. From the image of the human upper body, we estimate the person's depth along with the regions-of-interest around his/her hands. The Convolutional Neural Networks in SaDNet are fine-tuned on a background-substituted hand-gestures dataset. They are used to detect 10 static gestures for each hand and to obtain hand image-embeddings from the last fully connected layer, which are then fused with the augmented pose vector and passed to stacked Long Short-Term Memory (LSTM) blocks. Human-centered frame-wise information from the augmented pose vector and the left/right-hand image-embeddings is thus aggregated over time to predict the dynamic gestures of the performing person. In a series of experiments, we show that the proposed approach surpasses state-of-the-art results on the large-scale ChaLearn 2016 dataset. Moreover, we transfer the knowledge learned through the proposed methodology to the Praxis gestures dataset, where the obtained results also outperform the state of the art.
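The per-frame fusion step described above can be sketched in a few lines: each frame contributes a left-hand and a right-hand image-embedding (from the CNN's last fully connected layer) plus an augmented pose vector, and these are concatenated into one feature vector per frame before the sequence is handed to the stacked LSTM blocks. The dimensions below are illustrative assumptions, not the paper's actual values, and `fuse_frame`/`build_sequence` are hypothetical helper names.

```python
# Minimal sketch of frame-wise feature fusion for dynamic gesture
# recognition, as described in the abstract. All sizes are assumed.

EMB_DIM = 128   # assumed size of each hand image-embedding
POSE_DIM = 24   # assumed size of the augmented pose vector

def fuse_frame(left_emb, right_emb, pose_vec):
    """Concatenate one frame's left/right hand embeddings and pose vector."""
    assert len(left_emb) == EMB_DIM and len(right_emb) == EMB_DIM
    assert len(pose_vec) == POSE_DIM
    return left_emb + right_emb + pose_vec  # list concatenation

def build_sequence(frames):
    """Stack fused per-frame vectors into a (T, D) sequence.

    The resulting sequence is what would be fed, frame by frame,
    into the stacked LSTM blocks to aggregate information over time.
    """
    return [fuse_frame(l, r, p) for (l, r, p) in frames]

# Example: a dummy 3-frame sequence of zero-valued features.
frames = [([0.0] * EMB_DIM, [0.0] * EMB_DIM, [0.0] * POSE_DIM)
          for _ in range(3)]
seq = build_sequence(frames)
print(len(seq), len(seq[0]))  # 3 frames, each 128 + 128 + 24 = 280 dims
```

In a real implementation the fused vectors would be tensors and the temporal aggregation would be done by a recurrent network (e.g. stacked LSTM layers), but the concatenate-then-stack structure is the same.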

