An Exploration of Neural Radiance Field Scene Reconstruction: Synthetic, Real-world and Dynamic Scenes

10/21/2022
by Benedict Quartey, et al.

This project explores 3D scene reconstruction of synthetic and real-world scenes using Neural Radiance Field (NeRF) approaches. We primarily take advantage of the reduced training and rendering times afforded by the multi-resolution hash encoding of Instant Neural Graphics Primitives to reconstruct static video game scenes and real-world scenes, comparing reconstruction detail and observing limitations. Additionally, we explore dynamic scene reconstruction using Neural Radiance Fields for Dynamic Scenes (D-NeRF). Finally, we extend the D-NeRF implementation, originally constrained to synthetic scenes, to also handle real-world dynamic scenes.
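The speedup referenced above comes from replacing NeRF's frequency positional encoding with a learned multi-resolution hash encoding: each input coordinate is looked up in several hashed feature grids of increasing resolution, interpolated, and the per-level features are concatenated before being fed to a small MLP. The following is a minimal 2D NumPy sketch of that lookup (not the authors' or Instant-NGP's actual implementation; the function names, the 2D setting, and the fixed table sizes are illustrative assumptions):

```python
import numpy as np

def hash_coords(ix, iy, table_size):
    # Spatial hash in the style of Instant-NGP: XOR of integer grid
    # coordinates multiplied by large primes, folded into the table.
    return ((ix * 1) ^ (iy * 2654435761)) % table_size

def multires_hash_encode(x, y, tables, base_res=16, growth=2.0):
    """Encode a 2D point in [0, 1)^2 with L hashed feature grids.

    tables: list of (T, F) arrays, one trainable table per level.
    Returns the concatenated, bilinearly interpolated features.
    """
    feats = []
    for level, table in enumerate(tables):
        res = int(base_res * growth ** level)  # grid resolution at this level
        T = table.shape[0]
        # Position of the point within this level's grid.
        gx, gy = x * res, y * res
        x0, y0 = int(gx), int(gy)
        tx, ty = gx - x0, gy - y0
        # Gather feature vectors at the four surrounding corners.
        c00 = table[hash_coords(x0,     y0,     T)]
        c10 = table[hash_coords(x0 + 1, y0,     T)]
        c01 = table[hash_coords(x0,     y0 + 1, T)]
        c11 = table[hash_coords(x0 + 1, y0 + 1, T)]
        # Bilinear interpolation of the corner features.
        f = (c00 * (1 - tx) * (1 - ty) + c10 * tx * (1 - ty)
             + c01 * (1 - tx) * ty + c11 * tx * ty)
        feats.append(f)
    return np.concatenate(feats)

# Example: 4 levels, 2 features per level -> an 8-dim encoding.
rng = np.random.default_rng(0)
tables = [rng.standard_normal((2**12, 2)) for _ in range(4)]
encoding = multires_hash_encode(0.3, 0.7, tables)  # shape (8,)
```

Because the tables are small and the MLP behind them can be tiny, training converges in seconds to minutes rather than the hours to days required by the original NeRF encoding.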


research
11/20/2020

Neural Scene Graphs for Dynamic Scenes

Recent implicit neural rendering methods have demonstrated that it is po...
research
01/24/2023

K-Planes: Explicit Radiance Fields in Space, Time, and Appearance

We introduce k-planes, a white-box model for radiance fields in arbitrar...
research
04/01/2023

Multi-view reconstruction of bullet time effect based on improved NSFF model

Bullet time is a type of visual effect commonly used in film, television...
research
10/10/2022

NeRF2Real: Sim2real Transfer of Vision-guided Bipedal Motion Skills using Neural Radiance Fields

We present a system for applying sim2real approaches to "in the wild" sc...
research
06/13/2023

Binary Radiance Fields

In this paper, we propose binary radiance fields (BiRF), a storage-effic...
research
12/10/2021

PERF: Performant, Explicit Radiance Fields

We present a novel way of approaching image-based 3D reconstruction base...
research
08/30/2023

Drone-NeRF: Efficient NeRF Based 3D Scene Reconstruction for Large-Scale Drone Survey

Neural rendering has garnered substantial attention owing to its capacit...
