Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations

02/12/2021
by Winnie Xu et al.

We perform scalable approximate inference in a recently proposed family of continuous-depth Bayesian neural networks. In this model class, uncertainty about separate weights in each layer produces dynamics that follow a stochastic differential equation (SDE). We demonstrate gradient-based stochastic variational inference in this infinite-parameter setting, producing arbitrarily flexible approximate posteriors. We also derive a novel gradient estimator that approaches zero variance as the approximate posterior approaches the true posterior. This approach further inherits the memory-efficient training and tunable precision of neural ODEs.
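As a concrete illustration of variational inference over an SDE, the sketch below uses the torchsde library (released with the "Scalable Gradients for Stochastic Differential Equations" paper listed under related research). It is a minimal sketch, not the paper's SDE-BNN architecture: the `LatentSDE` model, the Ornstein-Uhlenbeck prior, the layer sizes, and the stand-in likelihood are all illustrative assumptions. What it does demonstrate is the core recipe: a learned posterior drift `f`, a fixed prior drift `h` sharing the diffusion `g`, and a pathwise KL term returned by `sdeint(..., logqp=True)` that enters the ELBO.

```python
import torch
import torchsde

class LatentSDE(torch.nn.Module):
    # Hypothetical minimal model: posterior and prior share the diffusion g,
    # so the KL between their path measures is finite (via Girsanov's theorem).
    noise_type = "diagonal"
    sde_type = "ito"

    def __init__(self, dim=4, hidden=64):
        super().__init__()
        # Approximate-posterior drift: a small neural net over (state, time).
        self.post_drift = torch.nn.Sequential(
            torch.nn.Linear(dim + 1, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, dim))
        self.theta = torch.nn.Parameter(torch.ones(dim))  # prior OU rate

    def f(self, t, y):  # posterior drift
        ty = torch.cat([y, t.expand(y.size(0), 1)], dim=1)
        return self.post_drift(ty)

    def h(self, t, y):  # prior drift: Ornstein-Uhlenbeck pull toward zero
        return -self.theta * y

    def g(self, t, y):  # diagonal diffusion, shared by prior and posterior
        return torch.ones_like(y)

sde = LatentSDE()
y0 = torch.randn(8, 4)             # batch of 8 initial latent states
ts = torch.linspace(0.0, 1.0, 32)  # the continuous "depth" grid
# With logqp=True, sdeint also returns per-step KL(posterior || prior) increments.
ys, kl_steps = torchsde.sdeint(sde, y0, ts, method="euler", logqp=True)
recon = -(ys[-1] ** 2).sum(dim=1)  # stand-in log-likelihood of dummy data
elbo = (recon - kl_steps.sum(dim=0)).mean()
(-elbo).backward()                 # stochastic gradient of the negative ELBO
```

For the memory-efficient training mentioned above, torchsde also provides `sdeint_adjoint`, which backpropagates through the solve via the stochastic adjoint method rather than storing intermediate solver states.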


Related research

- Scalable Gradients for Stochastic Differential Equations (01/05/2020)
  The adjoint sensitivity method scalably computes gradients of solutions ...

- Variational Inference for Infinitely Deep Neural Networks (09/21/2022)
  We introduce the unbounded depth neural network (UDN), an infinitely dee...

- Latent Time Neural Ordinary Differential Equations (12/23/2021)
  Neural ordinary differential equations (NODE) have been proposed as a co...

- Improving Robustness and Uncertainty Modelling in Neural Ordinary Differential Equations (12/23/2021)
  Neural ordinary differential equations (NODE) have been proposed as a co...

- Constraining the Dynamics of Deep Probabilistic Models (02/15/2018)
  We introduce a novel generative formulation of deep probabilistic models...

- Robust and Scalable SDE Learning: A Functional Perspective (10/11/2021)
  Stochastic differential equations provide a rich class of flexible gener...

- Latent SDEs on Homogeneous Spaces (06/28/2023)
  We consider the problem of variational Bayesian inference in a latent va...
