Collaborative Visual Inertial SLAM for Multiple Smart Phones

06/23/2021
by Jialing Liu, et al.

Efficient and accurate mapping is crucial for large-scene, long-term AR applications, and multi-agent cooperative SLAM is a precondition for multi-user AR interaction. Cooperation among multiple smartphones can improve the efficiency and robustness of task completion and enable tasks that no single agent can accomplish alone. It depends, however, on robust communication, efficient location detection, robust mapping, and efficient information sharing among agents. We propose a multi-agent collaborative monocular visual-inertial SLAM system deployed on multiple iOS mobile devices with a centralized architecture. Each agent independently explores the environment, runs a visual-inertial odometry module online, and sends all of its measurement information to a central server with greater computing resources. The server manages all received information, detects overlapping areas, merges and optimizes the map, and shares information with the agents when needed. We verified the performance of the system on public datasets and in real environments. The mapping and fusion accuracy of the proposed system is comparable to that of VINS-Mono, which requires more computing resources.
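The centralized flow described in the abstract (agents stream odometry measurements to a server, which detects overlap and merges maps) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the message fields, the descriptor-intersection overlap test, and all names (`KeyframeMsg`, `CentralServer`, `overlap_threshold`) are hypothetical stand-ins for the actual place-recognition and map-fusion machinery.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class KeyframeMsg:
    """Hypothetical keyframe message an agent would send to the server."""
    agent_id: int
    frame_id: int
    pose: tuple          # (x, y, z) position in the agent's local frame
    descriptors: frozenset  # stand-in for visual place-recognition features


class CentralServer:
    """Toy central server: stores per-agent maps and flags overlapping pairs."""

    def __init__(self, overlap_threshold=3):
        self.maps = {}                      # agent_id -> list of keyframes
        self.overlap_threshold = overlap_threshold
        self.merged_pairs = set()           # agent pairs whose maps overlap

    def receive(self, msg):
        # Store the incoming keyframe, then check for overlap with other agents.
        self.maps.setdefault(msg.agent_id, []).append(msg)
        self._detect_overlap(msg)

    def _detect_overlap(self, msg):
        # Naive overlap test: count shared descriptors against every stored
        # keyframe from other agents; a real system would use a BoW database.
        for other_id, frames in self.maps.items():
            if other_id == msg.agent_id:
                continue
            for kf in frames:
                if len(msg.descriptors & kf.descriptors) >= self.overlap_threshold:
                    self.merged_pairs.add(frozenset((msg.agent_id, other_id)))
```

For example, two agents whose keyframes share enough descriptors would be flagged for map merging:

```python
server = CentralServer(overlap_threshold=2)
server.receive(KeyframeMsg(0, 0, (0.0, 0.0, 0.0), frozenset({"a", "b", "c"})))
server.receive(KeyframeMsg(1, 0, (1.0, 0.0, 0.0), frozenset({"b", "c", "d"})))
# server.merged_pairs now contains frozenset({0, 1})
```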

