The Gaussian Neural Process

01/10/2021
by Wessel P. Bruinsma, et al.

Neural Processes (NPs; Garnelo et al., 2018a,b) are a rich class of models for meta-learning that map data sets directly to predictive stochastic processes. We provide a rigorous analysis of the standard maximum-likelihood objective used to train conditional NPs. Moreover, we propose a new member of the Neural Process family, the Gaussian Neural Process (GNP), which models predictive correlations, incorporates translation equivariance, provides universal approximation guarantees, and demonstrates encouraging performance.

Related research

07/02/2020
Meta-Learning Stationary Stochastic Process Prediction with Convolutional Neural Processes
Stationary stochastic processes (SPs) are a key component of many probab...

08/22/2021
Efficient Gaussian Neural Processes for Regression
Conditional Neural Processes (CNP; Garnelo et al., 2018) are an attracti...

03/16/2022
Practical Conditional Neural Processes Via Tractable Dependent Predictions
Conditional Neural Processes (CNPs; Garnelo et al., 2018a) are meta-lear...

09/01/2022
The Neural Process Family: Survey, Applications and Perspectives
The standard approaches to neural network implementation yield powerful ...

03/25/2023
Autoregressive Conditional Neural Processes
Conditional neural processes (CNPs; Garnelo et al., 2018a) are attractiv...

05/30/2023
Adaptive Conditional Quantile Neural Processes
Neural processes are a family of probabilistic models that inherit the f...

10/29/2019
Convolutional Conditional Neural Processes
We introduce the Convolutional Conditional Neural Process (ConvCNP), a n...
