Sharing Cognition: Human Gesture and Natural Language Grounding Based Planning and Navigation for Indoor Robots

08/14/2021
by   Gourav Kumar, et al.

Cooperation among humans makes it easy to execute tasks and navigate seamlessly, even in unknown scenarios. With our individual knowledge and collective cognition skills, we can reason about and perform well in unforeseen situations and environments. To achieve a similar potential for a robot navigating among humans and interacting with them, it is crucial for it to acquire easy, efficient, and natural ways of communicating and sharing cognition with humans. In this work, we aim to exploit human gestures, which are known to be the most prominent modality of communication after speech. We demonstrate how gestures for communicating spatial understanding can be incorporated in a simple yet effective way using a robot equipped with vision and listening capabilities. This offers a significant advantage over relying only on Vision-and-Language Navigation, language grounding, or human-robot interaction in tasks that require building cognition and navigating indoors. We adapt state-of-the-art language grounding and human-robot interaction modules into a novel system pipeline, demonstrated on a telepresence robot performing a set of challenging tasks in real-world indoor environments. To the best of our knowledge, this is the first pipeline to couple the fields of HRI and language grounding in an indoor environment to demonstrate autonomous navigation.
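The abstract describes the pipeline only at a high level (gesture perception plus spoken-language grounding feeding an indoor navigation planner) and does not publish code, so the following is a minimal illustrative sketch of how such a pipeline could be wired. Every class and function name here (GestureRecognizer, LanguageGrounder, Navigator, run_pipeline, Goal) is a hypothetical placeholder, not the authors' implementation.

```python
# Hypothetical sketch of a gesture + language grounded navigation pipeline.
# A pointing gesture and a spoken instruction are fused into a grounded
# spatial goal, which is then handed to a navigation planner.

from dataclasses import dataclass


@dataclass
class Goal:
    x: float      # target position in the robot's map frame (metres)
    y: float
    label: str    # e.g. "the chair the person pointed at"


class GestureRecognizer:
    def pointing_direction(self, rgb_frame):
        """Estimate a unit direction vector from the human's pointing gesture."""
        raise NotImplementedError  # e.g. body-pose estimation on the RGB frame


class LanguageGrounder:
    def ground(self, utterance: str, pointing_dir, scene) -> Goal:
        """Resolve the spoken referring expression against the observed scene,
        using the pointing direction to disambiguate, and return a Goal."""
        raise NotImplementedError


class Navigator:
    def go_to(self, goal: Goal) -> None:
        """Plan and execute a collision-free path to the goal."""
        raise NotImplementedError


def run_pipeline(rgb_frame, utterance, scene,
                 gestures: GestureRecognizer,
                 grounder: LanguageGrounder,
                 nav: Navigator) -> Goal:
    direction = gestures.pointing_direction(rgb_frame)   # vision: where is the human pointing?
    goal = grounder.ground(utterance, direction, scene)   # fuse speech with the gesture cue
    nav.go_to(goal)                                       # autonomous indoor navigation
    return goal
```

In this sketch the gesture cue acts as a spatial prior for the language grounder, which is one plausible reading of how the paper couples the two modalities; the actual module interfaces may differ.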


