Autoregressive Conditional Neural Processes

03/25/2023
by Wessel P. Bruinsma, et al.

Conditional neural processes (CNPs; Garnelo et al., 2018a) are attractive meta-learning models which produce well-calibrated predictions and are trainable via a simple maximum likelihood procedure. Although CNPs have many advantages, they are unable to model dependencies in their predictions. Various works propose solutions to this, but these come at the cost of either requiring approximate inference or being limited to Gaussian predictions. In this work, we instead propose to change how CNPs are deployed at test time, without any modifications to the model or training procedure. Instead of making predictions independently for every target point, we autoregressively define a joint predictive distribution using the chain rule of probability, taking inspiration from the neural autoregressive density estimator (NADE) literature. We show that this simple procedure allows factorised Gaussian CNPs to model highly dependent, non-Gaussian predictive distributions. Perhaps surprisingly, in an extensive range of tasks with synthetic and real data, we show that CNPs in autoregressive (AR) mode not only significantly outperform non-AR CNPs, but are also competitive with more sophisticated models that are significantly more computationally expensive and challenging to train. This performance is remarkable given that AR CNPs are not trained to model joint dependencies. Our work provides an example of how ideas from neural distribution estimation can benefit neural processes, and motivates research into the AR deployment of other neural process models.
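To make the AR deployment described in the abstract concrete, the sketch below draws one joint sample from a factorised-Gaussian CNP by applying the chain rule: each target point is predicted conditionally, sampled, and then fed back into the context before the next prediction. The `cnp` callable and its interface (returning per-point predictive means and variances) are assumptions for illustration only, not the paper's actual implementation.

```python
import numpy as np

def ar_sample(cnp, x_context, y_context, x_target, rng):
    """Draw one joint sample from a factorised-Gaussian CNP deployed
    autoregressively: each sampled target value is appended to the
    context before the next target point is predicted (chain rule).

    `cnp(x_context, y_context, x_target)` is a hypothetical interface
    assumed to return per-point predictive means and variances.
    """
    xc, yc = list(x_context), list(y_context)
    y_sample = []
    for x in x_target:  # the ordering of target points may be randomised
        mean, var = cnp(np.array(xc), np.array(yc), np.array([x]))
        y = rng.normal(mean[0], np.sqrt(var[0]))
        y_sample.append(y)
        xc.append(x)  # condition subsequent predictions on this draw
        yc.append(y)
    return np.array(y_sample)

# Repeating ar_sample with different seeds yields samples from a joint
# predictive that can be dependent and non-Gaussian, even though each
# individual conditional is a Gaussian produced by the CNP.
```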

Related research

- 08/22/2021: Efficient Gaussian Neural Processes for Regression
- 03/16/2022: Practical Conditional Neural Processes Via Tractable Dependent Predictions
- 03/23/2023: Adversarially Contrastive Estimation of Conditional Neural Processes
- 01/10/2021: The Gaussian Neural Process
- 02/21/2020: Deep Sigma Point Processes
- 07/14/2022: Scene Text Recognition with Permuted Autoregressive Sequence Models
- 02/12/2015: MADE: Masked Autoencoder for Distribution Estimation
