An Augmented Reality Interaction Interface for Autonomous Drone

by Chuhao Liu et al.

Human-drone interaction in autonomous navigation involves spatial interaction tasks, including visualizing the 3D map reconstructed by the drone and specifying the operator's desired target position. Augmented Reality (AR) devices can be powerful interactive tools for handling these spatial interactions. In this work, we build an AR interface that displays the drone's reconstructed 3D map on physical surfaces in front of the operator. Spatial target positions can then be set on the 3D map through intuitive head gaze and hand gestures. The AR interface is deployed to interact with an autonomous drone exploring an unknown environment, and a user study is conducted to evaluate the overall interaction performance.
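To illustrate the kind of spatial interaction the abstract describes, the sketch below shows one plausible way a head-gaze ray could be turned into a navigation goal on a map anchored to a physical surface. This is a minimal illustration under simplifying assumptions (a horizontal anchor plane, a translation-only anchor pose, a uniform display scale); all names such as `intersect_gaze_with_surface` and `to_map_frame` are hypothetical and not taken from the paper.

```python
# Hypothetical sketch: converting a head-gaze ray into a navigation goal
# in the drone's map frame, assuming the reconstructed map is rendered on
# a horizontal surface at height `surface_y` in the AR world frame.

def intersect_gaze_with_surface(origin, direction, surface_y):
    """Intersect a gaze ray with the horizontal plane y = surface_y.

    Returns the 3D hit point, or None if the ray is parallel to the
    plane or points away from it.
    """
    dy = direction[1]
    if abs(dy) < 1e-9:
        return None
    t = (surface_y - origin[1]) / dy
    if t <= 0:
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

def to_map_frame(hit, anchor, scale):
    """Convert an AR-world hit point into drone-map coordinates.

    `anchor` is the map's placement (translation only, for simplicity)
    and `scale` is the miniature display scale of the map.
    """
    return tuple((h - a) / scale for h, a in zip(hit, anchor))

# Example: operator at head height 1.6 m gazes down toward a map
# displayed on a table surface at y = 0.8 m.
hit = intersect_gaze_with_surface((0.0, 1.6, 0.0), (0.0, -0.5, 0.5), 0.8)
goal = to_map_frame(hit, anchor=(0.0, 0.8, 1.0), scale=0.05)
```

In a real AR pipeline the ray would be intersected with the map mesh itself rather than a plane, and a confirming hand gesture (e.g. an air tap) would commit the goal; this sketch only captures the geometric core of the gaze-to-goal mapping.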




