NPBG++: Accelerating Neural Point-Based Graphics

03/24/2022
by   Ruslan Rakhimov, et al.

We present a new system (NPBG++) for the novel view synthesis (NVS) task that achieves high rendering realism with low scene fitting time. Our method efficiently leverages the multiview observations and the point cloud of a static scene to predict a neural descriptor for each point, improving upon the pipeline of Neural Point-Based Graphics in several important ways. By predicting the descriptors with a single pass through the source images, we lift the requirement of per-scene optimization while also making the neural descriptors view-dependent and more suitable for scenes with strong non-Lambertian effects. In our comparisons, the proposed system outperforms previous NVS approaches in terms of fitting and rendering runtimes while producing images of similar quality.
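The core idea, predicting a descriptor for each point by aggregating image features sampled at the point's projection in the source views in a single feed-forward pass, can be sketched as follows. This is an illustrative simplification, not the authors' implementation: the function names, the nearest-pixel sampling, the crude depth-based visibility test, and the mean pooling (where NPBG++ uses a learned, view-aware aggregator over CNN feature maps) are all assumptions made for the sketch.

```python
import numpy as np

def project(points, K, RT):
    """Project 3D points (N, 3) into pixel coordinates using intrinsics K
    (3x3) and world-to-camera extrinsics RT (3x4). Returns (N, 2) pixel
    coordinates and (N,) camera-space depths."""
    homog = np.concatenate([points, np.ones((points.shape[0], 1))], axis=1)
    cam = homog @ RT.T                 # (N, 3) camera-space points
    pix = cam @ K.T                    # apply intrinsics
    return pix[:, :2] / pix[:, 2:3], cam[:, 2]

def aggregate_descriptors(points, feature_maps, cameras):
    """Pool per-view image features into one descriptor per point.
    Here: nearest-pixel sampling and a plain mean over the views in which
    the point lies in front of the camera. The real system replaces this
    mean with a learned aggregation network."""
    n, dim = points.shape[0], feature_maps[0].shape[-1]
    descs = np.zeros((n, dim))
    counts = np.zeros((n, 1))
    for feat, (K, RT) in zip(feature_maps, cameras):
        uv, depth = project(points, K, RT)
        h, w, _ = feat.shape
        u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
        v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
        visible = depth > 0            # crude visibility test (no occlusion)
        descs[visible] += feat[v[visible], u[visible]]
        counts[visible] += 1
    return descs / np.maximum(counts, 1)
```

Because the descriptors come from one pass over the source images rather than per-scene gradient descent, fitting a new scene amounts to running this aggregation once, which is what removes the per-scene optimization cost.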


Related research:

- Neural Mesh-Based Graphics (08/10/2022)
- Neural Point-Based Graphics (06/19/2019)
- Fragment-History Volumes (11/28/2022)
- TRANSPR: Transparency Ray-Accumulating Neural 3D Scene Point Renderer (09/06/2020)
- Neural Point Catacaustics for Novel-View Synthesis of Reflections (01/03/2023)
- Neural Point Cloud Rendering via Multi-Plane Projection (12/10/2019)
- Differentiable Point-Based Radiance Fields for Efficient View Synthesis (05/28/2022)
