Bootstrapping Neural Processes

08/07/2020
by Juho Lee, et al.

Unlike traditional statistical modeling, in which a user typically hand-specifies a prior, Neural Processes (NPs) implicitly define a broad class of stochastic processes with neural networks. Given a data stream, an NP learns a stochastic process that best describes the data. While this "data-driven" way of learning stochastic processes has proven effective for various types of data, NPs still rely on the assumption that the uncertainty in the stochastic process is captured by a single latent variable, which potentially limits flexibility. To this end, we propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap. The bootstrap is a classical data-driven technique for estimating uncertainty, which allows BNP to learn the stochasticity in NPs without assuming a particular form. We demonstrate the efficacy of BNP on various types of data and its robustness in the presence of model-data mismatch.
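
To make the bootstrap idea concrete, below is a minimal sketch of how a paired bootstrap over the context set turns a deterministic NP into an ensemble-based uncertainty estimate. This illustrates the classical bootstrap only, not the paper's exact BNP construction; the callable np_model and the function bootstrap_np_predict are hypothetical names assumed for the example.

```python
import numpy as np

def bootstrap_np_predict(context_x, context_y, target_x, np_model,
                         n_boot=50, seed=None):
    """Paired bootstrap over the context set: resample (x, y) pairs with
    replacement, condition the NP on each resample, and collect the
    resulting target predictions as an ensemble.

    np_model is a hypothetical callable: given a context set and target
    inputs, it returns predictions at the target inputs as an array.
    """
    rng = np.random.default_rng(seed)
    n = len(context_x)
    preds = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)  # bootstrap indices, with replacement
        preds.append(np_model(context_x[idx], context_y[idx], target_x))
    preds = np.stack(preds)               # shape: (n_boot, n_target, ...)
    # The spread of the bootstrap ensemble serves as the uncertainty
    # estimate, in place of a single global latent variable.
    return preds.mean(axis=0), preds.std(axis=0)
```

With a mean-predicting np_model, bootstrap_np_predict returns the ensemble mean and a per-target standard deviation, so the uncertainty is estimated empirically from resampled data rather than from a fixed parametric assumption.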


Related research

Martingale Posterior Neural Processes (04/19/2023)
A Neural Process (NP) estimates a stochastic process implicitly defined ...

Doubly Stochastic Variational Inference for Neural Processes with Hierarchical Latent Variables (08/21/2020)
Neural processes (NPs) constitute a family of variational approximate mo...

Multi-Task Processes (10/28/2021)
Neural Processes (NPs) consider a task as a function realized from a sto...

Learning to reflect: A unifying approach for data-driven stochastic control strategies (04/23/2021)
Stochastic optimal control problems have a long tradition in applied pro...

Latent Variable Models in the Era of Industrial Big Data: Extension and Beyond (08/23/2022)
A rich supply of data and innovative algorithms have made data-driven mo...
