JacobiNeRF: NeRF Shaping with Mutual Information Gradients

04/01/2023
by Xiaomeng Xu, et al.

We propose a method that trains a neural radiance field (NeRF) to encode not only the appearance of the scene but also semantic correlations between scene points, regions, or entities – aiming to capture their mutual co-variation patterns. In contrast to the traditional first-order photometric reconstruction objective, our method explicitly regularizes the learning dynamics to align the Jacobians of highly correlated entities, which we show maximizes the mutual information between them under random scene perturbations. By paying attention to this second-order information, we can shape a NeRF to express semantically meaningful synergies when the network weights are changed by a delta along the gradient of a single entity, region, or even a point. To demonstrate the merit of this mutual information modeling, we leverage the coordinated behavior of scene entities that emerges from our shaping to perform label propagation for semantic and instance segmentation. Our experiments show that a JacobiNeRF is more efficient in propagating annotations among 2D pixels and 3D points compared to NeRFs without mutual information shaping, especially in extremely sparse label regimes – thus reducing the annotation burden. The same machinery can further be used for entity selection or scene modifications.
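To make the shaping idea concrete, below is a minimal sketch (not the authors' released code) of how one could implement a Jacobian-alignment regularizer in PyTorch. The `nerf` module, the scalar per-point output, and the strategy for sampling correlated (`pos_pairs`) and uncorrelated (`neg_pairs`) point pairs are all placeholder assumptions; the sketch only illustrates the core mechanism of aligning per-point gradients with respect to the network weights.

```python
import torch
import torch.nn.functional as F

def jacobian_wrt_weights(nerf, x):
    """Flattened gradient of the scalar output at point x w.r.t. all NeRF weights."""
    out = nerf(x).sum()  # reduce to a scalar so autograd.grad gives one Jacobian row
    grads = torch.autograd.grad(out, list(nerf.parameters()), create_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])

def shaping_loss(nerf, pos_pairs, neg_pairs):
    """Second-order regularizer added to the usual photometric loss:
    align Jacobians of correlated points, decorrelate the rest."""
    def cos(a, b):
        return F.cosine_similarity(
            jacobian_wrt_weights(nerf, a), jacobian_wrt_weights(nerf, b), dim=0
        )

    pos = torch.stack([cos(a, b) for a, b in pos_pairs])  # drive toward +1
    neg = torch.stack([cos(a, b) for a, b in neg_pairs])  # drive toward 0
    return (1.0 - pos).mean() + neg.abs().mean()
```

In training, a term like `lambda * shaping_loss(...)` would be added to the photometric reconstruction objective. The payoff described in the abstract then follows: perturbing the weights by a small step along one point's Jacobian co-varies exactly the points whose Jacobians were aligned with it, which is what enables sparse-label propagation for segmentation.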
