RGB-D Mapping and Tracking in a Plenoxel Radiance Field

07/07/2023
by Andreas L. Teigen, et al.

Building on the success of Neural Radiance Fields (NeRFs), the field of novel view synthesis has seen significant advances in recent years. These models capture the scene's volumetric radiance field and produce highly convincing, dense, photorealistic models through simple, differentiable rendering equations. Despite their popularity, these algorithms suffer from the severe ambiguities inherent in RGB-only visual data: images generated with view synthesis can appear very believable even though the underlying 3D model is often wrong. This considerably limits the usefulness of such models in practical applications like robotics and extended reality (XR), where an accurate, dense 3D reconstruction would be of significant value. In this technical report, we present the vital differences between view synthesis models and 3D reconstruction models. We also explain why a depth sensor is essential for modeling accurate geometry in general outward-facing scenes under the current paradigm of novel view synthesis methods. Focusing on the structure-from-motion task, we demonstrate this need in practice by extending the Plenoxel radiance field model, presenting an analytical differential approach for dense mapping and tracking with radiance fields based on RGB-D data that requires no neural network. Our method achieves state-of-the-art results on both the mapping and tracking tasks while also being faster than competing neural-network-based approaches.
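The "simple, differentiable rendering equations" mentioned above are the standard emission-absorption quadrature shared by NeRF and Plenoxels: a pixel color is C = Σ_i T_i (1 − exp(−σ_i δ_i)) c_i with transmittance T_i = exp(−Σ_{j<i} σ_j δ_j), and the same per-sample weights give an expected ray-termination depth D = Σ_i T_i (1 − exp(−σ_i δ_i)) t_i, which is what an RGB-D loss can supervise directly. The Python sketch below illustrates this quadrature together with a combined photometric-plus-depth loss. It is a minimal illustration, not the authors' implementation: the composite_ray helper, the toy random inputs, and the 0.1 depth-loss weight are assumptions made for the example.

import numpy as np

def composite_ray(sigmas, rgbs, t_vals):
    """Standard NeRF/Plenoxels volume-rendering quadrature for one ray.

    sigmas: (N,) volume densities at the N samples along the ray
    rgbs:   (N, 3) colors at the samples
    t_vals: (N,) distances of the samples from the ray origin
    """
    # Spacing between consecutive samples; the last interval is open-ended.
    deltas = np.diff(t_vals, append=t_vals[-1] + 1e10)
    # Per-sample opacity: alpha_i = 1 - exp(-sigma_i * delta_i).
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance T_i: probability the ray reaches sample i unoccluded.
    trans = np.cumprod(np.append(1.0, 1.0 - alphas[:-1]))
    weights = trans * alphas
    color = (weights[:, None] * rgbs).sum(axis=0)  # rendered pixel color
    depth = (weights * t_vals).sum()               # expected termination depth
    return color, depth

# Toy usage: render one ray and compare against an RGB-D measurement.
rng = np.random.default_rng(0)
t_vals = np.linspace(0.5, 5.0, 64)
sigmas = rng.uniform(0.0, 2.0, size=64)
rgbs = rng.uniform(0.0, 1.0, size=(64, 3))
color, depth = composite_ray(sigmas, rgbs, t_vals)

rgb_gt = np.array([0.4, 0.5, 0.6])  # measured pixel color
depth_gt = 2.0                      # measured depth from the RGB-D sensor
# Combined loss; the 0.1 weighting of the depth term is an illustrative choice.
loss = ((color - rgb_gt) ** 2).sum() + 0.1 * (depth - depth_gt) ** 2
print(f"color={color}, depth={depth:.3f}, loss={loss:.4f}")

Because both the rendered color and the rendered depth are analytic functions of the voxel-grid parameters and the camera pose, their gradients can be derived in closed form, which is what makes dense mapping and tracking possible without a neural network.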


