Context Aware Robot Navigation using Interactively Built Semantic Maps

10/24/2017
by Akansel Cosgun, et al.

We discuss the process of building semantic maps, how to interactively label entities in them, and how to use them to enable new navigation behaviors for specific scenarios. We use planar surfaces, such as walls and tables, and static objects, such as door signs, as features in our semantic mapping approach. Users can interactively annotate these features by having the robot follow them, entering a label through a mobile app, and performing a pointing gesture toward the landmark of interest. These landmarks can later be used to generate context-aware motions. Our pointing gesture approach reliably estimates the target object from human joint positions and detects ambiguous gestures with probabilistic modeling. Our person-following method attempts to maximize future utility by searching over future actions, assuming a constant-velocity model for the human. We describe a simple method to extract metric goals from a semantic map landmark and present a human-aware path planner that considers people's personal spaces to generate socially aware paths. Finally, we demonstrate context-awareness for person following in two scenarios: interactive labeling and door passing. We believe that as sensing technology improves and maps with richer semantic information become commonplace, new opportunities for intelligent navigation algorithms will emerge.
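As a rough illustration of the human-aware planning idea described above, the sketch below layers anisotropic "personal space" costs around detected people onto a 2D grid cost map, so that a standard grid planner (e.g. A*) would prefer paths that keep a comfortable distance. This is a minimal sketch under stated assumptions, not the paper's implementation: the function name `add_personal_space_costs`, the Gaussian widths `sigma_side`/`sigma_front`, and the `peak` cost are illustrative choices.

```python
# Minimal sketch (assumptions, not the paper's method): add personal-space
# costs around people to a grid cost map before path planning.
import numpy as np


def add_personal_space_costs(cost_map, people, resolution=0.05,
                             sigma_side=0.45, sigma_front=0.90, peak=80.0):
    """Inflate the cost map with an anisotropic Gaussian around each person.

    cost_map   -- 2D numpy array of traversal costs (row = y, col = x)
    people     -- list of (x, y, heading) tuples in map coordinates [m, m, rad]
    resolution -- map resolution in meters per cell
    """
    h, w = cost_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xs_m, ys_m = xs * resolution, ys * resolution
    out = cost_map.astype(float).copy()
    for (px, py, heading) in people:
        # Express each cell in the person's frame: 'fwd' points along the heading.
        dx, dy = xs_m - px, ys_m - py
        fwd = np.cos(heading) * dx + np.sin(heading) * dy
        side = -np.sin(heading) * dx + np.cos(heading) * dy
        # Use a wider Gaussian in front of the person than behind/beside them.
        sigma_f = np.where(fwd > 0, sigma_front, sigma_side)
        cost = peak * np.exp(-0.5 * ((fwd / sigma_f) ** 2 + (side / sigma_side) ** 2))
        out = np.maximum(out, cost)
    return out


if __name__ == "__main__":
    grid = np.zeros((100, 100))                        # 5 m x 5 m free map
    grid = add_personal_space_costs(grid, [(2.5, 2.5, 0.0)])
    print(grid.max(), grid[50, 50])                    # highest cost at the person
```

A planner run over the inflated map then trades path length against social cost; tuning the Gaussian widths shifts that trade-off.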


Related research

11/04/2021 · Speed Maps: An Application to Guide Robots in Human Environments
We present the concept of speed maps: speed limits for mobile robots in ...

01/24/2023 · Context-aware robot control using gesture episodes
Collaborative robots became a popular tool for increasing productivity i...

10/17/2022 · Predicting Dense and Context-aware Cost Maps for Semantic Robot Navigation
We investigate the task of object goal navigation in unknown environment...

09/17/2023 · LivelySpeaker: Towards Semantic-Aware Co-Speech Gesture Generation
Gestures are non-verbal but important behaviors accompanying people's sp...

06/18/2021 · Semantic navigation with domain knowledge
Several deployment locations of mobile robotic systems are human made (i...

04/21/2021 · Semantic Navigation Using Building Information on Construction Sites
With the growth in automated data collection of construction projects, t...

11/07/2019 · An Agent-Based Intelligent HCI Information System in Mixed Reality
This paper presents a design of agent-based intelligent HCI (iHCI) syste...
