Urban Radiance Field Representation with Deformable Neural Mesh Primitives

07/20/2023
by Fan Lu, et al.

Neural Radiance Fields (NeRFs) have achieved great success in the past few years. However, most current methods still require intensive computational resources because of their ray-marching-based rendering. To construct urban-level radiance fields efficiently, we design the Deformable Neural Mesh Primitive (DNMP) and propose to parameterize the entire scene with such primitives. The DNMP is a flexible and compact neural variant of the classic mesh representation, which enjoys both the efficiency of rasterization-based rendering and the powerful neural representation capability needed for photo-realistic image synthesis. Specifically, a DNMP consists of a set of connected deformable mesh vertices with paired vertex features that parameterize the geometry and radiance information of a local area. To constrain the degrees of freedom during optimization and lower the storage budget, we enforce that the shape of each primitive is decoded from a relatively low-dimensional latent space. Rendering colors are decoded from the vertex features (interpolated via rasterization) by a view-dependent MLP. The DNMP provides a new paradigm for urban-level scene representation with appealing properties: (1) high-quality rendering, achieving leading performance for novel view synthesis in urban scenarios; (2) low computational cost, enabling fast rendering (2.07 ms/1k pixels) and low peak memory usage (110 MB/1k pixels). We also present a lightweight version that runs 33× faster than vanilla NeRFs and is comparable to the highly optimized Instant-NGP (0.61 vs. 0.71 ms/1k pixels). Project page: https://dnmp.github.io/.
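The abstract outlines the core structure of a DNMP: a low-dimensional latent code decoded into per-vertex offsets of a small template mesh, learnable per-vertex radiance features, and a view-dependent MLP that shades features gathered by a rasterizer. The following is a minimal PyTorch sketch of that structure only; all sizes, layer widths, and the template mesh are illustrative assumptions rather than the paper's actual configuration, and the differentiable rasterization step is merely mocked.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DNMP(nn.Module):
    """One deformable neural mesh primitive: a low-dimensional latent code decoded
    into per-vertex offsets of a shared template mesh, plus learnable per-vertex
    radiance features (sizes here are illustrative assumptions)."""

    def __init__(self, n_vertices=42, latent_dim=8, feat_dim=16):
        super().__init__()
        self.latent = nn.Parameter(torch.zeros(latent_dim))                  # shape code
        self.vertex_feat = nn.Parameter(torch.zeros(n_vertices, feat_dim))   # radiance features
        # Decoding the shape from a small latent space constrains the degrees of
        # freedom during optimization and keeps per-primitive storage low.
        self.shape_decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, n_vertices * 3),
        )

    def vertices(self, template_verts):
        # template_verts: (n_vertices, 3) canonical primitive, e.g. a small icosphere
        offsets = self.shape_decoder(self.latent).view(-1, 3)
        return template_verts + offsets


class ViewDependentShader(nn.Module):
    """Maps rasterization-interpolated vertex features plus a view direction to RGB."""

    def __init__(self, feat_dim=16, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim + 3, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),
        )

    def forward(self, interp_feat, view_dir):
        # interp_feat: (n_pixels, feat_dim) features interpolated by the rasterizer
        # view_dir:    (n_pixels, 3) unit viewing directions
        return self.mlp(torch.cat([interp_feat, view_dir], dim=-1))


# In a real pipeline a differentiable rasterizer (e.g. PyTorch3D) would produce
# barycentrically interpolated vertex features per pixel; here raw vertex features
# stand in for them, just to show the data flow end to end.
template = 0.1 * torch.randn(42, 3)
primitive, shader = DNMP(), ViewDependentShader()
deformed_verts = primitive.vertices(template)        # (42, 3) deformed geometry
fake_interp = primitive.vertex_feat[:5]              # (5, 16) mock per-pixel features
view_dirs = F.normalize(torch.randn(5, 3), dim=-1)
rgb = shader(fake_interp, view_dirs)                 # (5, 3) predicted colors
print(deformed_verts.shape, rgb.shape)
```

In the full method many such primitives tile the scene, so the per-primitive latent code and feature table are what keep storage and rendering cost low compared with ray marching a volumetric field.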

Related research

ProbNVS: Fast Novel View Synthesis with Learned Probability-Guided Sampling (04/07/2022)
Existing state-of-the-art novel view synthesis methods rely on either fa...

Neural Mesh-Based Graphics (08/10/2022)
We revisit NPBG, the popular approach to novel view synthesis that intro...

NeurMiPs: Neural Mixture of Planar Experts for View Synthesis (04/28/2022)
We present Neural Mixtures of Planar Experts (NeurMiPs), a novel planar-...

Blocks2World: Controlling Realistic Scenes with Editable Primitives (07/07/2023)
We present Blocks2World, a novel method for 3D scene rendering and editi...

Learning Neural Duplex Radiance Fields for Real-Time View Synthesis (04/20/2023)
Neural radiance fields (NeRFs) enable novel view synthesis with unpreced...

HVH: Learning a Hybrid Neural Volumetric Representation for Dynamic Hair Performance Capture (12/13/2021)
Capturing and rendering life-like hair is particularly challenging due t...

TransHuman: A Transformer-based Human Representation for Generalizable Neural Human Rendering (07/23/2023)
In this paper, we focus on the task of generalizable neural human render...
