Sampling-Free Variational Inference of Bayesian Neural Nets

05/19/2018
by Melih Kandemir, et al.

We propose a new Bayesian Neural Net (BNN) formulation that affords variational inference for which the evidence lower bound (ELBO) is analytically tractable subject to a tight approximation. We achieve this tractability by decomposing ReLU nonlinearities into an identity function and a Kronecker delta function. We demonstrate formally that assigning the outputs of these functions to separate latent variables allows representing the neural network likelihood as the composition of a chain of linear operations. Performing variational inference on this construction enables closed-form computation of the evidence lower bound. It can thus be maximized without requiring Monte Carlo sampling to approximate the problematic expected log-likelihood term. The resultant formulation boils down to stochastic gradient descent, where the gradients are not distorted by any factor besides minibatch selection. This remedies a long-standing disadvantage of BNNs relative to deterministic nets. Experiments on four benchmark data sets show that the cleaner gradients provided by our construction yield a steeper learning curve, achieving higher prediction accuracies for a fixed epoch budget.
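The central construction in the abstract is the decomposition of the ReLU nonlinearity into an identity path gated by a Kronecker delta (Heaviside) indicator: conditioned on the gate values, the network output is a chain of linear maps in the input. The sketch below illustrates this identity numerically. It is a minimal NumPy illustration of the decomposition only, not the paper's inference code, and the variable names (relu_decomposed, W1, W2, z1) are ours.

```python
import numpy as np

def relu_decomposed(h):
    """Decompose ReLU into a Kronecker-delta gate z = 1[h > 0] and an
    identity path, so that ReLU(h) == z * h elementwise.
    (Illustrative sketch; names are ours, not the paper's.)"""
    z = (h > 0).astype(h.dtype)   # Heaviside / Kronecker-delta gate
    return z * h, z               # identity path scaled by the gate

rng = np.random.default_rng(0)
x  = rng.normal(size=4)
W1 = rng.normal(size=(5, 4))
W2 = rng.normal(size=(3, 5))

# Conditioned on the gates z1, a two-layer ReLU net collapses to a
# single linear map: W2 @ (z1 * (W1 @ x)) == (W2 @ diag(z1) @ W1) @ x.
a1, z1 = relu_decomposed(W1 @ x)
out_nonlinear = W2 @ a1
out_linear    = (W2 @ np.diag(z1) @ W1) @ x
assert np.allclose(out_nonlinear, out_linear)
```

Treating the gate values as separate latent variables is what lets the paper express the network likelihood as a chain of linear operations, which in turn makes the expected log-likelihood term of the ELBO computable in closed form rather than by Monte Carlo sampling.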

Related research

04/19/2017 · Importance Sampled Stochastic Optimization for Variational Inference
Variational inference approximates the posterior distribution of a proba...

06/27/2020 · Stochastic Bayesian Neural Networks
Bayesian neural networks perform variational inference over weights but ...

02/06/2016 · Rényi Divergence Variational Inference
This paper introduces the variational Rényi bound (VR) that extends trad...

06/11/2019 · Approximate Variational Inference Based on a Finite Sample of Gaussian Latent Variables
Variational methods are employed in situations where exact Bayesian infe...

03/02/2020 · Bayesian Neural Networks With Maximum Mean Discrepancy Regularization
Bayesian Neural Networks (BNNs) are trained to optimize an entire distri...

09/12/2018 · Discretely Relaxing Continuous Variables for tractable Variational Inference
We explore a new research direction in Bayesian variational inference wi...

02/03/2014 · Efficient Gradient-Based Inference through Transformations between Bayes Nets and Neural Nets
Hierarchical Bayesian networks and neural networks with stochastic hidde...
