Generalizable Patch-Based Neural Rendering

07/21/2022
by Mohammed Suhail, et al.

Neural rendering has received tremendous attention since the advent of Neural Radiance Fields (NeRF), which pushed the state of the art in novel-view synthesis considerably. Recent work has focused on models that overfit to a single scene, and the few attempts to learn models that synthesize novel views of unseen scenes mostly combine deep convolutional features with a NeRF-like model. We propose a different paradigm that requires neither deep features nor NeRF-like volume rendering. Our method predicts the color of a target ray in a novel scene directly from a collection of patches sampled from the scene. We first leverage epipolar geometry to extract patches along the epipolar lines of each reference view. Each patch is linearly projected into a 1D feature vector, and a sequence of transformers processes the collection. For positional encoding, we parameterize rays as in a light field representation, with the crucial difference that the coordinates are canonicalized with respect to the target ray, which makes our method independent of the reference frame and improves generalization. We show that our approach outperforms the state of the art on novel-view synthesis of unseen scenes, even when trained with considerably less data than prior work.
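The pipeline the abstract describes — patches sampled along epipolar lines, a linear projection of each patch into a token, and transformer-style aggregation into a single ray color — can be sketched in a few lines. The following is an illustrative toy, not the authors' implementation: the patch sizes, token dimension, random weights, and single self-attention layer are all assumptions made for the sketch.

```python
# Minimal sketch of "patches -> linear projection -> attention -> ray color".
# All weights are random; a real model would learn them and would also add
# the canonicalized light-field ray encoding described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def linear_project(patches, W, b):
    """Flatten each p x p x 3 patch and map it to a d-dim token."""
    n = patches.shape[0]
    flat = patches.reshape(n, -1)          # (n, p*p*3)
    return flat @ W + b                    # (n, d)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, Wq, Wk, Wv):
    """One scaled dot-product attention layer over the patch tokens."""
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return scores @ v                      # (n, d)

# Toy sizes: 8 reference-view patches of 4x4 pixels, token dimension 16.
n, p, d = 8, 4, 16
patches = rng.random((n, p, p, 3))
W = rng.normal(size=(p * p * 3, d))
b = np.zeros(d)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
W_rgb = rng.normal(size=(d, 3))

tokens = linear_project(patches, W, b)
attended = self_attention(tokens, Wq, Wk, Wv)
# Pool the attended tokens and decode one color for the target ray.
rgb = 1 / (1 + np.exp(-(attended.mean(axis=0) @ W_rgb)))  # sigmoid -> [0, 1]
print(rgb.shape)  # (3,)
```

The key design point the abstract emphasizes is that this predicts the ray color directly from patches, with no NeRF-style volume rendering along the ray.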

Related research

05/15/2021 — NeuLF: Efficient Novel View Synthesis with Neural 4D Light Field
In this paper, we present an efficient and robust deep learning solution...

08/22/2023 — Enhancing NeRF akin to Enhancing LLMs: Generalizable NeRF Transformer with Mixture-of-View-Experts
Cross-scene generalizable NeRF models, which can directly synthesize nov...

05/16/2023 — Ray-Patch: An Efficient Decoder for Light Field Transformers
In this paper we propose the Ray-Patch decoder, a novel model to efficie...

07/29/2022 — End-to-end View Synthesis via NeRF Attention
In this paper, we present a simple seq2seq formulation for view synthesi...

01/25/2023 — Ultra-NeRF: Neural Radiance Fields for Ultrasound Imaging
We present a physics-enhanced implicit neural representation (INR) for u...

12/06/2022 — Ref-NPR: Reference-Based Non-Photorealistic Radiance Fields
Existing 3D scene stylization methods employ an arbitrary style referenc...

08/10/2022 — Neural Mesh-Based Graphics
We revisit NPBG, the popular approach to novel view synthesis that intro...
