NeO 360: Neural Fields for Sparse View Synthesis of Outdoor Scenes

08/24/2023
by Muhammad Zubair Irshad, et al.

Recent implicit neural representations have shown great results for novel view synthesis. However, existing methods require expensive per-scene optimization from many views, limiting their application to real-world unbounded urban settings where the objects of interest or backgrounds are observed from very few views. To mitigate this challenge, we introduce a new approach called NeO 360: Neural fields for sparse view synthesis of outdoor scenes. NeO 360 is a generalizable method that reconstructs 360° scenes from a single or a few posed RGB images. The essence of our approach lies in capturing the distribution of complex real-world outdoor 3D scenes and using a hybrid image-conditional triplanar representation that can be queried from any world point. Our representation combines the best of both voxel-based and bird's-eye-view (BEV) representations and is more effective and expressive than either. NeO 360's representation allows us to learn from a large collection of unbounded 3D scenes while offering generalizability to new views and novel scenes from as few as a single image during inference. We demonstrate our approach on the proposed challenging 360° unbounded dataset, called NeRDS 360, and show that NeO 360 outperforms state-of-the-art generalizable methods for novel view synthesis while also offering editing and composition capabilities. Project page: https://zubair-irshad.github.io/projects/neo360.html
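The central idea above is a triplanar representation that can be queried at any 3D world point. As a rough illustration of how such a query works in general, and not of NeO 360's specific architecture, the PyTorch sketch below projects each point onto three axis-aligned feature planes, bilinearly samples each plane, and concatenates the results. The plane layout, coordinate normalization, and aggregation by concatenation are illustrative assumptions.

import torch
import torch.nn.functional as F

def query_triplane(planes, points):
    # planes: (3, C, H, W) feature maps for the XY, XZ, and YZ planes
    #         (a hypothetical layout, not the paper's).
    # points: (N, 3) world coordinates already normalized to [-1, 1].
    # returns: (N, 3 * C) concatenated per-plane features.
    projections = [
        points[:, [0, 1]],  # project onto the XY plane
        points[:, [0, 2]],  # project onto the XZ plane
        points[:, [1, 2]],  # project onto the YZ plane
    ]
    feats = []
    for plane, uv in zip(planes, projections):
        # grid_sample expects a (1, H_out, W_out, 2) grid; treat the N
        # points as an N x 1 grid and sample the plane bilinearly.
        grid = uv.view(1, -1, 1, 2)
        sampled = F.grid_sample(plane.unsqueeze(0), grid,
                                mode="bilinear", align_corners=True)
        feats.append(sampled[0, :, :, 0].t())  # (N, C)
    return torch.cat(feats, dim=-1)

# Example: 32-channel planes of resolution 64x64, 1024 query points.
planes = torch.randn(3, 32, 64, 64)
points = torch.rand(1024, 3) * 2 - 1
features = query_triplane(planes, points)  # shape (1024, 96)

In a full pipeline, per-point features like these would typically condition a density and radiance decoder for volume rendering; that stage, and the image-conditioning of the planes, are omitted from this sketch.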
