WristSketcher: Creating Dynamic Sketches in AR with a Sensing Wristband

10/21/2022
by Enting Ying, et al.

Restricted by the limited interaction area of native AR glasses (e.g., touch bars), it is challenging to create sketches on AR glasses. Recent works have attempted to use mobile devices (e.g., tablets) or mid-air bare-hand gestures to expand the interactive space and serve as 2D/3D sketching input interfaces for AR glasses. Of the two, mobile devices allow accurate sketching but are often cumbersome to carry, while sketching with bare hands imposes no carrying burden but can be inaccurate due to arm instability. In addition, mid-air bare-hand sketching can easily lead to social misunderstandings, and its prolonged use causes arm fatigue. As a new attempt, in this work we present WristSketcher, a new AR system based on a flexible sensing wristband for creating 2D dynamic sketches, featuring an almost zero-burden authoring model for accurate and comfortable sketch creation in real-world scenarios. Specifically, we streamline the interaction space from mid-air to the surface of a lightweight sensing wristband, and implement AR sketching and the associated interaction commands through a gesture recognition method based on the sensed pressure points on the wristband. The set of interactive gestures used by WristSketcher is determined by a heuristic study of user preferences. Moreover, we endow WristSketcher with animation creation capabilities, allowing it to produce dynamic and expressive sketches. Experimental results demonstrate that our WristSketcher i) faithfully recognizes users' gesture interactions with a high accuracy of 96.0%; ii) achieves higher sketching accuracy than Freehand sketching; iii) achieves high user satisfaction in ease of use, usability and functionality; and iv) shows innovation potential in art creation, memory aids, and entertainment applications.
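
To give a concrete, if simplified, picture of the kind of pressure-based gesture recognition described above, here is a minimal sketch of a nearest-centroid classifier over flattened pressure frames. The 8x8 sensor grid, the gesture labels, and the data shapes are illustrative assumptions and are not details taken from the WristSketcher paper.

# Illustrative only: nearest-centroid gesture classifier for a hypothetical
# 8x8 pressure-sensing wristband; not the authors' recognition method.
import numpy as np

GESTURES = ["tap", "swipe_left", "swipe_right", "draw_stroke"]  # hypothetical labels

def fit_centroids(samples: np.ndarray, labels: np.ndarray) -> dict:
    """Average the flattened pressure frames belonging to each gesture label."""
    flat = samples.reshape(len(samples), -1)
    return {g: flat[labels == g].mean(axis=0) for g in np.unique(labels)}

def classify(frame: np.ndarray, centroids: dict) -> str:
    """Return the gesture whose centroid is closest to the incoming frame."""
    v = frame.ravel()
    return min(centroids, key=lambda g: np.linalg.norm(v - centroids[g]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    labels = np.repeat(GESTURES, 20)            # 20 synthetic samples per gesture
    samples = rng.random((len(labels), 8, 8))   # each sample: an 8x8 pressure map
    centroids = fit_centroids(samples, labels)
    print(classify(rng.random((8, 8)), centroids))

In a real system, a classifier trained on labeled wristband recordings would replace the random data above; the point here is only the overall pipeline of mapping a pressure frame on the band surface to a discrete interaction command.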
