Attention Beats Concatenation for Conditioning Neural Fields

09/21/2022
by Daniel Rebain, et al.

Neural fields model signals by mapping coordinate inputs to sampled values. They are becoming an increasingly important backbone architecture across many domains, from vision and graphics to biology and astronomy. In this paper, we explore the differences between common conditioning mechanisms within these networks, an essential ingredient in shifting neural fields from memorization of individual signals to generalization, where a set of signals lying on a manifold is modelled jointly. In particular, we are interested in how these mechanisms scale to increasingly high-dimensional conditioning variables. As our experiments show, high-dimensional conditioning is key to modelling complex data distributions; it is therefore important to determine which architectural choices best enable it. To this end, we run experiments modelling 2D, 3D, and 4D signals with neural fields, employing concatenation, hyper-network, and attention-based conditioning strategies – a necessary but laborious effort that has not been performed in the literature. We find that attention-based conditioning outperforms the other approaches in a variety of settings.
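
Below is a minimal PyTorch sketch, not the authors' implementation, contrasting the three conditioning mechanisms the abstract names: concatenation, a hyper-network, and cross-attention. All module names, layer widths, and the single-head attention are illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of three ways to condition a
# neural field f(x; z) on a latent z. Sizes and names are assumptions.
import torch
import torch.nn as nn

class ConcatField(nn.Module):
    """Concatenation: the latent z is appended to the coordinate input."""
    def __init__(self, coord_dim=2, latent_dim=256, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(coord_dim + latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, z):  # x: (B, coord_dim), z: (B, latent_dim)
        return self.net(torch.cat([x, z], dim=-1))

class HyperField(nn.Module):
    """Hyper-network: z is mapped to the weights of a small field MLP."""
    def __init__(self, coord_dim=2, latent_dim=256, hidden=256):
        super().__init__()
        self.coord_dim, self.hidden = coord_dim, hidden
        n_params = coord_dim * hidden + hidden + hidden * 1 + 1
        self.hyper = nn.Linear(latent_dim, n_params)

    def forward(self, x, z):  # x: (B, coord_dim), z: (B, latent_dim)
        p = self.hyper(z)
        c, h = self.coord_dim, self.hidden
        w1 = p[:, : c * h].view(-1, c, h)
        b1 = p[:, c * h : c * h + h]
        w2 = p[:, c * h + h : c * h + 2 * h].view(-1, h, 1)
        b2 = p[:, -1:]
        hid = torch.relu(torch.bmm(x.unsqueeze(1), w1).squeeze(1) + b1)
        return torch.bmm(hid.unsqueeze(1), w2).squeeze(1) + b2

class AttentionField(nn.Module):
    """Cross-attention: each coordinate query attends over latent tokens."""
    def __init__(self, coord_dim=2, token_dim=256, hidden=256):
        super().__init__()
        self.q = nn.Linear(coord_dim, hidden)
        self.k = nn.Linear(token_dim, hidden)
        self.v = nn.Linear(token_dim, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x, tokens):  # tokens: (B, n_tokens, token_dim)
        q = self.q(x).unsqueeze(1)                    # (B, 1, hidden)
        k, v = self.k(tokens), self.v(tokens)         # (B, n, hidden)
        attn = torch.softmax(
            q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5, dim=-1)
        return self.out((attn @ v).squeeze(1))        # (B, 1)
```

In this sketch the structural difference is visible directly: concatenation and the hyper-network consume the latent as one flat vector, so their cost and capacity are tied to its dimensionality, while cross-attention treats the conditioning variable as a set of tokens that each coordinate query can address selectively, which is one way to read the scaling behaviour the paper studies.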


