Touchless Typing using Head Movement-based Gestures

01/24/2020
by Shivam Rustagi, et al.

Physical contact-based typing interfaces are not suitable for people with upper limb disabilities such as quadriplegia. This paper therefore proposes a touchless typing interface that uses an on-screen QWERTY keyboard and a front-facing smartphone camera mounted on a stand. The keys of the keyboard are grouped into nine color-coded clusters. Users point to the letters they want to type simply by moving their head, and these head movements are recorded by the camera. The recorded gestures are then translated into a cluster sequence. The translation module is implemented using CNN-RNN, Conv3D, and a modified GRU-based model that uses pre-trained embeddings rich in head-pose features. The performance of these models is evaluated under four different scenarios on a dataset of 2,234 video sequences collected from 22 users. The modified GRU-based model outperforms the standard CNN-RNN and Conv3D models in three of the four scenarios. The results are encouraging and suggest promising directions for future research.
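The abstract does not spell out the network configuration, so the following is only a minimal illustrative sketch of the gesture-to-cluster step it describes: a recorded head gesture, represented here as per-frame head-pose features (yaw/pitch/roll) standing in for the paper's pre-trained head-pose embedding, is mapped to one of nine key clusters. The cluster layout, feature dimensions, and the plain GRU classifier below are all assumptions for illustration, not the authors' modified GRU, CNN-RNN, or Conv3D models.

```python
# Minimal sketch (not the authors' exact architecture): classify one head-gesture
# clip into one of nine QWERTY key clusters. Assumes per-frame head-pose features
# (e.g., yaw/pitch/roll from a face-landmark library) are already extracted.
import torch
import torch.nn as nn

# Hypothetical grouping of the 26 letters into nine clusters (illustrative only;
# the paper's actual color-coded grouping may differ).
CLUSTERS = [
    list("qwe"), list("rty"), list("uio"),
    list("pas"), list("dfg"), list("hjk"),
    list("lzx"), list("cvb"), list("nm"),
]

class GestureToCluster(nn.Module):
    """GRU over per-frame head-pose feature vectors -> cluster label."""

    def __init__(self, feat_dim=3, hidden_dim=64, num_clusters=9):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_clusters)

    def forward(self, x):
        # x: (batch, frames, feat_dim), e.g., yaw/pitch/roll per frame
        _, h_n = self.gru(x)              # h_n: (1, batch, hidden_dim)
        return self.classifier(h_n[-1])   # logits over the nine clusters

if __name__ == "__main__":
    model = GestureToCluster()
    dummy_gesture = torch.randn(1, 60, 3)   # one 60-frame gesture clip
    logits = model(dummy_gesture)
    cluster_id = logits.argmax(dim=-1).item()
    print("predicted cluster:", cluster_id, CLUSTERS[cluster_id])
```

In a full pipeline along these lines, each gesture's predicted cluster would be appended to form the cluster sequence that is later decoded back into the intended word.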

Related research

11/03/2021
A Comparison of Deep Learning Models for the Prediction of Hand Hygiene Videos
This paper presents a comparison of various deep learning models such as...

02/12/2022
"I Don't Want People to Look At Me Differently": Designing User-Defined Above-the-Neck Gestures for People with Upper Body Motor Impairments
Recent research proposed eyelid gestures for people with upper-body moto...

02/25/2023
Real-Time Recognition of In-Place Body Actions and Head Gestures using Only a Head-Mounted Display
Body actions and head gestures are natural interfaces for interaction in...

01/10/2020
Recognition and Localisation of Pointing Gestures using a RGB-D Camera
Non-verbal communication is part of our regular conversation, and multip...

05/20/2022
HeadText: Exploring Hands-free Text Entry using Head Gestures by Motion Sensing on a Smart Earpiece
We present HeadText, a hands-free technique on a smart earpiece for text...

12/18/2018
Mobile Head Tracking for eCommerce and Beyond
Shopping is difficult for people with motor impairments. This includes o...

10/22/2018
Visual Rendering of Shapes on 2D Display Devices Guided by Hand Gestures
Designing of touchless user interface is gaining popularity in various c...
