Neural Processes

07/04/2018
by Marta Garnelo, et al.

A neural network (NN) is a parameterised function that can be tuned via gradient descent to approximate a labelled collection of data with high precision. A Gaussian process (GP), on the other hand, is a probabilistic model that defines a distribution over possible functions and is updated in light of data via the rules of probabilistic inference. GPs are probabilistic, data-efficient and flexible; however, they are also computationally intensive, which limits their applicability. We introduce a class of neural latent variable models, which we call Neural Processes (NPs), that combines the best of both worlds. Like GPs, NPs define distributions over functions, are capable of rapid adaptation to new observations, and can estimate the uncertainty in their predictions. Like NNs, NPs are computationally efficient during training and evaluation, but they also learn to adapt their priors to data. We demonstrate the performance of NPs on a range of learning tasks, including regression and optimisation, and compare and contrast with related models in the literature.
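The abstract's central idea — encode a set of observed (x, y) context points into a global latent summary, then decode any target input into a predictive mean and uncertainty — can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the weights are random and untrained, the latent is used deterministically (a full NP samples z from a distribution parameterised by the aggregated representation), and all names (`mlp`, `neural_process`, dimensions) are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Small MLP with random (untrained) weights; a stand-in for a learned network."""
    params = [(rng.normal(0, 0.5, (m, n)), np.zeros(n))
              for m, n in zip(sizes[:-1], sizes[1:])]
    def f(x):
        for i, (W, b) in enumerate(params):
            x = x @ W + b
            if i < len(params) - 1:
                x = np.tanh(x)  # hidden layers only; linear output layer
        return x
    return f

# Encoder h maps each context pair (x_i, y_i) to a representation r_i;
# decoder g maps (z, x*) to a predictive mean and log-std at target x*.
h = mlp([2, 16, 8])        # r_i is 8-dimensional (arbitrary choice)
g = mlp([8 + 1, 16, 2])

def neural_process(x_ctx, y_ctx, x_tgt):
    # 1. Encode each context pair, then aggregate with a permutation-invariant mean.
    r = h(np.stack([x_ctx, y_ctx], axis=-1)).mean(axis=0)
    # 2. Use r directly as the global latent z (a full NP would sample
    #    z ~ N(mu(r), sigma(r)) to obtain a distribution over functions).
    z = r
    # 3. Decode: condition every target input on the same global z.
    inp = np.concatenate([np.tile(z, (len(x_tgt), 1)), x_tgt[:, None]], axis=1)
    out = g(inp)
    mu, sigma = out[:, 0], np.exp(out[:, 1])  # exp keeps the predicted std positive
    return mu, sigma

# Two context observations, predictions (with uncertainty) at five target inputs.
mu, sigma = neural_process(np.array([0.0, 1.0]), np.array([0.5, -0.2]),
                           np.linspace(0.0, 1.0, 5))
```

Because the encoder's outputs are aggregated by a mean, the model is invariant to the ordering of context points, and conditioning on a new observation is a single cheap forward pass rather than the cubic-cost posterior update of an exact GP — the efficiency point the abstract makes.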

