RigidFusion: Robot Localisation and Mapping in Environments with Large Dynamic Rigid Objects

10/21/2020 ∙ by Ran Long, et al.

This work presents a novel approach to simultaneously track a robot with respect to multiple rigid entities, including the environment and additional dynamic objects in a scene. Previous approaches either treat dynamic parts of a scene as outliers, and are thus limited to scenes with a small amount of dynamics, or rely on prior information about every object in the scene to enable robust camera tracking. Here, we propose to formulate localisation and object tracking as the same underlying problem and to simultaneously estimate multiple rigid transformations, thereby enabling simultaneous localisation and object tracking for mobile manipulators in dynamic scenes. We evaluate our approach on multiple challenging dynamic scenes with large occlusions. The evaluation demonstrates that our approach achieves better scene segmentation and camera pose tracking in highly dynamic scenes, without requiring knowledge of the dynamic objects' appearance.
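The core idea — once points are segmented into a static background and a dynamic rigid object, each entity's motion is an independent rigid transformation to be estimated — can be sketched with a per-entity least-squares rigid fit (the Kabsch/Umeyama method). This is only an illustrative sketch: the segmentation step, the joint formulation, and all variable names below are assumptions for the example, not the paper's actual algorithm.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) such that dst ≈ src @ R.T + t (Kabsch)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det(R) = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical scene: static background points and one dynamic object,
# each undergoing its own rigid motion between two frames.
rng = np.random.default_rng(0)
bg_prev = rng.uniform(-1, 1, size=(50, 3))
obj_prev = rng.uniform(-1, 1, size=(30, 3)) + np.array([2.0, 0.0, 0.0])

R_cam, t_cam = rot_z(0.1), np.array([0.05, 0.0, 0.0])   # apparent background motion (camera ego-motion)
R_obj, t_obj = rot_z(-0.3), np.array([0.0, 0.2, 0.0])   # independent object motion

bg_curr = bg_prev @ R_cam.T + t_cam
obj_curr = obj_prev @ R_obj.T + t_obj

# Given a static/dynamic segmentation, each rigid motion is recovered independently.
R1, t1 = estimate_rigid_transform(bg_prev, bg_curr)
R2, t2 = estimate_rigid_transform(obj_prev, obj_curr)
```

In this toy setting both transformations are recovered exactly; the difficulty the paper addresses is obtaining the segmentation and the motions jointly, robustly, and under large occlusions.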


