GP-ConvCNP: Better Generalization for Convolutional Conditional Neural Processes on Time Series Data

06/09/2021
by   Jens Petersen, et al.

Neural Processes (NPs) are a family of conditional generative models that can model a distribution over functions, allowing them to make predictions at test time conditioned on a number of context points. A recent addition to this family, the Convolutional Conditional Neural Process (ConvCNP), has shown remarkable performance improvements over prior art, but we find that it sometimes struggles to generalize when applied to time series data. In particular, it is not robust to distribution shifts and fails to extrapolate observed patterns into the future. By incorporating a Gaussian Process into the model, we remedy this and at the same time improve performance within distribution. As an added benefit, the Gaussian Process reintroduces the possibility to sample from the model, a key feature of other members of the NP family.
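The abstract notes that the Gaussian Process component restores the ability to draw function samples conditioned on context points. As a minimal illustration of that GP ingredient only (not the paper's GP-ConvCNP architecture), the sketch below draws samples from a standard GP posterior given a few context observations; the RBF kernel, its hyperparameters, and the helper names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=0.5, variance=1.0):
    # Squared-exponential (RBF) kernel between two sets of 1-D inputs.
    # Hyperparameters here are arbitrary illustrative choices.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_samples(x_ctx, y_ctx, x_tgt, n_samples=3, noise=1e-4, seed=0):
    """Draw function samples from a GP posterior conditioned on context points.

    This is textbook GP regression (Cholesky-based), used only to
    illustrate how a GP enables sampling given a context set.
    """
    K_cc = rbf_kernel(x_ctx, x_ctx) + noise * np.eye(len(x_ctx))
    K_tc = rbf_kernel(x_tgt, x_ctx)
    K_tt = rbf_kernel(x_tgt, x_tgt)

    # Posterior mean: K_tc @ K_cc^{-1} @ y, via Cholesky for stability.
    L = np.linalg.cholesky(K_cc)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_ctx))
    mean = K_tc @ alpha

    # Posterior covariance: K_tt - K_tc @ K_cc^{-1} @ K_ct.
    v = np.linalg.solve(L, K_tc.T)
    cov = K_tt - v.T @ v

    rng = np.random.default_rng(seed)
    jitter = 1e-8 * np.eye(len(x_tgt))  # numerical stabilizer
    samples = rng.multivariate_normal(mean, cov + jitter, size=n_samples)
    return mean, samples
```

Each returned sample is a coherent function draw that interpolates the context points, which is exactly the sampling capability the abstract says the GP reintroduces.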


Related Research

- Recurrent Neural Processes (06/13/2019)
- Equivariant Conditional Neural Processes (11/25/2020)
- The Elliptical Processes: a New Family of Flexible Stochastic Processes (03/13/2020)
- Attentive Neural Processes (01/17/2019)
- Convolutional Conditional Neural Processes (10/29/2019)
- Gaussian process modelling of multiple short time series (10/09/2012)
- Deep Discriminative Direct Decoders for High-dimensional Time-series Analysis (05/22/2022)
