HeadText: Exploring Hands-free Text Entry using Head Gestures by Motion Sensing on a Smart Earpiece

by Songlin Xu, et al.

We present HeadText, a hands-free text entry technique based on motion sensing on a smart earpiece. Users input text using only 7 head gestures, which cover key selection, word selection, word commitment, and word cancellation. Head gesture recognition is supported by motion sensing on the smart earpiece, which captures head-motion signals, combined with a machine learning algorithm (K-Nearest-Neighbor (KNN) with a Dynamic Time Warping (DTW) distance measure). A 10-participant user study showed that HeadText recognizes the 7 head gestures with an accuracy of 94.29%, and achieves a maximum text entry speed of 10.65 WPM and an average speed of 9.84 WPM. Finally, we demonstrate potential applications of HeadText in hands-free scenarios: (a) text entry for people with motor impairments, (b) private text entry, and (c) socially acceptable text entry.
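The recognition pipeline named in the abstract, KNN with a DTW distance measure, can be illustrated with a minimal sketch. The function names, template structure, and example signals below are illustrative assumptions, not the authors' implementation; real input would be multi-axis IMU data from the earpiece rather than short 1-D sequences.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D signals.

    Fills a cumulative-cost matrix where each cell holds the cost of the
    best warping path aligning prefixes of `a` and `b`.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible predecessor paths.
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def knn_classify(query, templates, k=1):
    """Label a motion signal by majority vote of its k nearest
    labeled templates under DTW distance.

    `templates` is a list of (signal, label) pairs (hypothetical format).
    """
    dists = sorted((dtw_distance(query, sig), label) for sig, label in templates)
    top = [label for _, label in dists[:k]]
    return max(set(top), key=top.count)
```

Because DTW aligns sequences elastically in time, the same head gesture performed faster or slower still maps to a small distance, which is why it pairs well with template-based KNN for gesture recognition.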
