Optimal Assistance for Object-Rearrangement Tasks in Augmented Reality

by Benjamin Newman, et al.

Augmented-reality (AR) glasses, with onboard sensors and the ability to display relevant information to the user, present an opportunity to provide assistance in quotidian tasks. Many such tasks can be characterized as object-rearrangement tasks. We introduce a novel framework for computing and displaying AR assistance that consists of (1) associating an optimal action sequence with the policy of an embodied agent and (2) presenting this sequence to the user as suggestions in the AR system's heads-up display. The embodied agent is a "hybrid" of the AR system and the user, combining the AR system's observation space (i.e., sensors) with the user's action space (i.e., task-execution actions); its policy is learned by minimizing task-completion time. In this initial study, we assume that the AR system's observations include a map of the environment and localization of the user and the objects. These choices allow us to formalize the problem of computing AR assistance for any object-rearrangement task as a planning problem, specifically a capacitated vehicle-routing problem. Further, we introduce a novel AR simulator, built on the Habitat simulator for embodied artificial intelligence, that enables web-based evaluation of AR-like assistance and the associated at-scale data collection. Finally, using this simulator on Amazon Mechanical Turk, we evaluate user response to the proposed form of AR assistance on a specific quotidian object-rearrangement task, house cleaning. In particular, we study the effect of the proposed assistance on users' task performance and sense of agency across a range of task difficulties. Our results indicate that such assistance improves users' overall performance, and that while users report a negative impact on their sense of agency, they may still prefer the proposed assistance to having no assistance at all.
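The abstract does not spell out the routing formulation, but the flavor of the capacitated vehicle-routing view can be illustrated with a minimal sketch: the user is treated as a vehicle with a limited carrying capacity who must collect objects and return them to a drop-off point. The greedy nearest-neighbor heuristic below is only an illustration of the problem structure, not the paper's solver, and all function and variable names are hypothetical:

```python
import math

def plan_rearrangement(start, pickups, capacity):
    """Sketch of a capacitated pickup route: repeatedly visit the
    nearest remaining object and return to the drop-off point (here
    assumed to be `start`) whenever carrying capacity is reached.
    `pickups` maps object names to (x, y) positions."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    route, pos, carried = [], start, 0
    remaining = dict(pickups)
    while remaining:
        # Nearest-neighbor choice of the next object to collect.
        name = min(remaining, key=lambda n: dist(pos, remaining[n]))
        route.append(("pick", name))
        pos = remaining.pop(name)
        carried += 1
        if carried == capacity or not remaining:
            # Capacity full (or nothing left): return to drop off.
            route.append(("drop", "goal"))
            pos, carried = start, 0
    return route

plan = plan_rearrangement((0, 0),
                          {"cup": (1, 0), "book": (2, 0), "plate": (0, 3)},
                          capacity=2)
```

In the paper's setting, the suggestions shown in the heads-up display would correspond to the next steps of such a plan; an exact or near-optimal CVRP solver would replace the greedy heuristic here.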




