Volumetric Data Fusion of External Depth and Onboard Proximity Data For Occluded Space Reduction

10/21/2021
by Matthew Strong, et al.

In this work, we present a method for the probabilistic fusion of external depth and onboard proximity data to form a volumetric 3D map of a robot's environment. We extend the OctoMap framework to update the representation of the area around the robot according to each sensor's optimal operating range. Areas that are occluded from the external view are sensed with onboard proximity sensors, yielding a more comprehensive map of the robot's nearby space. Our simulated results show that fusing external depth and onboard proximity data produces a more accurate map with fewer occlusions.
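The abstract combines two ideas: OctoMap-style log-odds occupancy updates, and gating each measurement by the sensor's trusted operating range so that near-field evidence comes from the onboard proximity sensors while far-field evidence comes from the external depth camera. The sketch below is a rough illustration of that fusion scheme, not the paper's implementation: the sensor ranges, log-odds constants, and function names are hypothetical, and a Python dictionary stands in for OctoMap's octree.

```python
import math

# Log-odds occupancy fusion sketch. Voxel keys are integer grid coordinates.
# All constants below are illustrative, not taken from the paper.
L_OCC = math.log(0.7 / 0.3)    # log-odds increment for an occupied observation
L_FREE = math.log(0.4 / 0.6)   # log-odds decrement for a free observation
L_MIN, L_MAX = -2.0, 3.5       # clamping bounds, in the style of OctoMap

# Hypothetical per-sensor operating ranges (meters): a measurement outside a
# sensor's trusted range is discarded rather than fused into the map.
SENSOR_RANGE = {
    "external_depth": (0.5, 5.0),    # external camera: unreliable very close
    "onboard_proximity": (0.0, 0.3), # proximity sensor: trusted near-field only
}

def update_voxel(grid, key, occupied):
    """Standard log-odds Bayesian update for a single voxel, with clamping."""
    delta = L_OCC if occupied else L_FREE
    grid[key] = min(L_MAX, max(L_MIN, grid.get(key, 0.0) + delta))

def fuse_measurement(grid, sensor, distance, voxel_key):
    """Fuse one range return only if it lies in the sensor's trusted range."""
    lo, hi = SENSOR_RANGE[sensor]
    if lo <= distance <= hi:
        update_voxel(grid, voxel_key, occupied=True)

def occupancy_probability(grid, key):
    """Convert stored log-odds back to an occupancy probability."""
    l = grid.get(key, 0.0)
    return math.exp(l) / (1.0 + math.exp(l))

if __name__ == "__main__":
    grid = {}
    # External camera sees a surface 2.1 m away; the proximity sensor detects
    # an obstacle 0.12 m from the robot that the camera's view occludes.
    fuse_measurement(grid, "external_depth", 2.1, (21, 0, 0))
    fuse_measurement(grid, "onboard_proximity", 0.12, (1, 0, 0))
    # A 0.12 m return from the external camera is outside its trusted range
    # and is ignored, leaving only the proximity-sensor evidence at that voxel.
    fuse_measurement(grid, "external_depth", 0.12, (1, 0, 0))
    for key in [(21, 0, 0), (1, 0, 0)]:
        print(key, round(occupancy_probability(grid, key), 3))
```

Clamping the log-odds, as OctoMap does, keeps the map responsive to change: a voxel cannot accumulate unbounded evidence, so a few contradicting observations are enough to flip its state.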

