Accurately and Efficiently Interpreting Human-Robot Instructions of Varying Granularities

04/21/2017
by Dilip Arumugam, et al.

Humans can ground natural language commands to tasks at both abstract and fine-grained levels of specificity. For instance, a human forklift operator can be instructed to perform a high-level action, like "grab a pallet," or a low-level action, like "tilt back a little bit." While robots are also capable of grounding language commands to tasks, previous methods implicitly assume that all commands and tasks reside at a single, fixed level of abstraction. Additionally, approaches that do not use abstraction suffer from inefficient planning and execution times due to large, intractable state-action spaces that closely resemble real-world complexity. In this work, by grounding commands to all the tasks or subtasks available in a hierarchical planning framework, we arrive at a model capable of interpreting language at multiple levels of specificity, ranging from coarse to more granular. We show that the accuracy of the grounding procedure improves when we simultaneously infer the degree of abstraction in the language used to communicate the task. Leveraging hierarchy also improves efficiency: our proposed approach enables a robot to respond to a command within one second on 90% of our test corpus. Finally, we demonstrate that a real, physical robot can ground commands at multiple levels of abstraction, allowing it to efficiently plan different subtasks within the same planning hierarchy.
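The core idea of jointly choosing an abstraction level and a task can be illustrated with a toy sketch. This is not the paper's model (which uses learned language groundings over a hierarchical planner); the task inventory and the bag-of-words overlap scorer below are hypothetical stand-ins:

```python
# Illustrative sketch: ground a command by jointly selecting the abstraction
# level and the task whose description best matches the command. A simple
# bag-of-words overlap score stands in for a learned grounding model.

# Hypothetical task inventory: high-level tasks and low-level actions.
TASKS = {
    "high": ["grab a pallet", "go to the storage room"],
    "low": ["tilt back a little bit", "move forward slightly"],
}

def score(command, task):
    """Jaccard overlap between command tokens and task-description tokens."""
    c, t = set(command.lower().split()), set(task.lower().split())
    return len(c & t) / max(len(c | t), 1)

def ground(command):
    """Return the (level, task) pair maximizing the match score."""
    return max(
        ((level, task) for level, tasks in TASKS.items() for task in tasks),
        key=lambda lt: score(command, lt[1]),
    )

level, task = ground("please tilt back a bit")
# Grounds to the low-level action "tilt back a little bit".
```

Because the argmax ranges over every level's tasks at once, the same procedure answers both coarse commands ("grab a pallet") and granular ones ("tilt back a little bit") without fixing the abstraction level in advance.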

Related research:

07/26/2017 - A Tale of Two DRAGGNs: A Hybrid Approach for Interpreting Action-Oriented and Goal-Oriented Instructions
Robots operating alongside humans in diverse, stochastic environments mu...

06/27/2021 - Draw Me a Flower: Grounding Formal Abstract Structures Stated in Informal Natural Language
Forming and interpreting abstraction is a core process in human communic...

05/27/2019 - Value Iteration Networks on Multiple Levels of Abstraction
Learning-based methods are promising to plan robot motion without perfor...

10/07/2022 - See, Plan, Predict: Language-guided Cognitive Planning with Video Prediction
Cognitive planning is the structural decomposition of complex tasks into...

07/05/2020 - Unsupervised Online Grounding of Natural Language during Human-Robot Interactions
Allowing humans to communicate through natural language with robots requ...

07/17/2020 - Toward Givenness Hierarchy Theoretic Natural Language Generation
Language-capable interactive robots participating in dialogues with huma...

07/18/2019 - Learning High-Level Planning Symbols from Intrinsically Motivated Experience
In symbolic planning systems, the knowledge on the domain is commonly pr...
