
Bootstrapping Neural Processes

08/07/2020
by Juho Lee et al.

Unlike traditional statistical modeling, in which a user typically hand-specifies a prior, Neural Processes (NPs) implicitly define a broad class of stochastic processes with neural networks. Given a data stream, an NP learns a stochastic process that best describes the data. While this "data-driven" way of learning stochastic processes has proven able to handle various types of data, NPs still rely on the assumption that uncertainty in the stochastic process is modeled by a single latent variable, which potentially limits their flexibility. To this end, we propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap. The bootstrap is a classical data-driven technique for estimating uncertainty, which allows BNP to learn the stochasticity in NPs without assuming a particular form. We demonstrate the efficacy of BNP on various types of data and its robustness in the presence of model-data mismatch.
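The classical bootstrap the abstract refers to estimates the uncertainty of a statistic purely from the data, by recomputing the statistic on resampled copies of the dataset. The sketch below is not the paper's BNP model, just a minimal illustration of this resampling idea; the function name `bootstrap_estimate` and all parameters are illustrative assumptions.

```python
import numpy as np

def bootstrap_estimate(data, statistic, n_boot=1000, seed=0):
    """Estimate the sampling uncertainty of `statistic` by
    resampling `data` with replacement (the classical bootstrap).
    This is an illustrative sketch, not the paper's BNP."""
    rng = np.random.default_rng(seed)
    n = len(data)
    # Recompute the statistic on n_boot resampled datasets.
    stats = np.array([
        statistic(data[rng.integers(0, n, size=n)])  # sample n indices with replacement
        for _ in range(n_boot)
    ])
    # Mean of the bootstrap replicates and their spread (std. error estimate).
    return stats.mean(), stats.std()

# Toy usage: uncertainty of the sample mean of 200 Gaussian draws.
data = np.random.default_rng(1).normal(loc=2.0, scale=1.0, size=200)
mean_est, std_err = bootstrap_estimate(data, np.mean)
```

Here the bootstrap standard error should come out close to the theoretical value sigma/sqrt(n) without ever assuming the data were Gaussian, which is the "no particular form" property BNP exploits for stochastic processes.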


Related research:

08/21/2020 · Doubly Stochastic Variational Inference for Neural Processes with Hierarchical Latent Variables
02/17/2022 · Variational Neural Temporal Point Process
10/28/2021 · Multi-Task Processes
04/23/2021 · Learning to reflect: A unifying approach for data-driven stochastic control strategies
08/23/2022 · Latent Variable Models in the Era of Industrial Big Data: Extension and Beyond
07/01/2022 · A Stochastic Contraction Mapping Theorem