Learned Initializations for Optimizing Coordinate-Based Neural Representations

12/03/2020
by Matthew Tancik, et al.

Coordinate-based neural representations have shown significant promise as an alternative to discrete, array-based representations for complex low-dimensional signals. However, optimizing a coordinate-based network from randomly initialized weights for each new signal is inefficient. We propose applying standard meta-learning algorithms to learn the initial weight parameters for these fully-connected networks based on the underlying class of signals being represented (e.g., images of faces or 3D models of chairs). Despite requiring only a minor change in implementation, using these learned initial weights enables faster convergence during optimization and can serve as a strong prior over the signal class being modeled, resulting in better generalization when only partial observations of a given signal are available. We explore these benefits across a variety of tasks, including representing 2D images, reconstructing CT scans, and recovering 3D shapes and scenes from 2D image observations.
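The core idea is to meta-learn the initial weights of a coordinate-based MLP so that a few gradient steps suffice to fit a new signal from the same class. The sketch below illustrates this with a Reptile-style outer loop in JAX; it is not the authors' implementation, and the network sizes, step counts, learning rates, and the random coordinate/value pairs standing in for a signal class are all illustrative assumptions.

```python
# Minimal sketch (assumptions throughout): meta-learn an initialization for a
# coordinate-based MLP using a Reptile-style outer update.
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    """Initialize fully-connected layer weights and biases."""
    params = []
    for din, dout in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (din, dout)) * jnp.sqrt(2.0 / din)
        params.append((w, jnp.zeros(dout)))
    return params

def mlp(params, coords):
    """Map 2D coordinates to RGB values with ReLU hidden layers."""
    x = coords
    for w, b in params[:-1]:
        x = jax.nn.relu(x @ w + b)
    w, b = params[-1]
    return x @ w + b

def loss_fn(params, coords, targets):
    return jnp.mean((mlp(params, coords) - targets) ** 2)

@jax.jit
def inner_fit(params, coords, targets, lr=1e-2, steps=2):
    """A few SGD steps fitting a single signal (the inner loop)."""
    for _ in range(steps):
        grads = jax.grad(loss_fn)(params, coords, targets)
        params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params

def reptile_outer_step(meta_params, task_batch, outer_lr=0.1):
    """Move the meta-initialization toward the task-adapted weights."""
    for coords, targets in task_batch:
        adapted = inner_fit(meta_params, coords, targets)
        meta_params = jax.tree_util.tree_map(
            lambda m, a: m + outer_lr * (a - m), meta_params, adapted)
    return meta_params

# Example usage with random data standing in for sampled signals of one class.
key = jax.random.PRNGKey(0)
meta_params = init_mlp(key, [2, 256, 256, 3])
for step in range(10):
    key, k1, k2 = jax.random.split(key, 3)
    coords = jax.random.uniform(k1, (1024, 2))    # sampled pixel coordinates
    targets = jax.random.uniform(k2, (1024, 3))   # corresponding RGB values
    meta_params = reptile_outer_step(meta_params, [(coords, targets)])
```

Fitting a new signal then amounts to running the inner loop from `meta_params` rather than from random weights, which is where the faster convergence and the class prior described in the abstract come from.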


Related research

09/13/2023 · Generalizable Neural Fields as Partially Observed Neural Processes
Neural fields, which represent signals as a function parameterized by a ...

07/26/2021 · H3D-Net: Few-Shot High-Fidelity 3D Head Reconstruction
Recent learning approaches that implicitly represent surface geometry us...

02/02/2023 · Factor Fields: A Unified Framework for Neural Fields and Beyond
We present Factor Fields, a novel framework for modeling and representin...

06/17/2020 · MetaSDF: Meta-learning Signed Distance Functions
Neural implicit shape representations are an emerging paradigm that offe...

05/18/2022 · Meta-Learning Sparse Compression Networks
Recent work in Deep Learning has re-imagined the representation of data ...

10/03/2022 · Random Weight Factorization Improves the Training of Continuous Neural Representations
Continuous neural representations have recently emerged as a powerful an...

11/23/2022 · Generalizable Implicit Neural Representations via Instance Pattern Composers
Despite recent advances in implicit neural representations (INRs), it re...
