
Radial and Directional Posteriors for Bayesian Neural Networks

02/07/2019
by ChangYong Oh et al.

We propose a new variational family for Bayesian neural networks. We decompose the variational posterior into two components: a radial component that captures the strength of each neuron in terms of its magnitude, and a directional component that captures the statistical dependencies among the weight parameters. The dependencies learned via the directional density provide better modeling performance than the widely used Gaussian mean-field variational family. In addition, the strength of input and output neurons learned via the radial density provides a structured way to compress neural networks. Indeed, experiments show that our variational family improves predictive performance and yields compressed networks simultaneously.
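To make the decomposition concrete, below is a minimal PyTorch sketch of sampling one neuron's weight vector as w = r * d, with a scalar radius r drawn from a radial density and a unit vector d drawn from a directional density. This is an illustration under stated assumptions, not the authors' exact construction: the function name is hypothetical, the radial density is assumed log-normal, and the directional sample is a simple projected-Gaussian stand-in for a proper directional density such as the von Mises-Fisher.

import torch

def sample_radial_directional(mu, log_sigma, loc):
    """Sample w = r * d for one neuron's fan-in weight vector.
    Assumptions for illustration: r ~ LogNormal(mu, sigma) via the
    reparameterization trick; d is a Gaussian perturbation of a learned
    mean direction `loc`, renormalized onto the unit hypersphere."""
    # Radial component: captures the neuron's magnitude (its "strength").
    eps = torch.randn(())
    r = torch.exp(mu + torch.exp(log_sigma) * eps)
    # Directional component: captures dependencies among the weights,
    # since all coordinates of d are coupled through the normalization.
    v = loc + 0.1 * torch.randn_like(loc)
    d = v / v.norm()
    return r * d

# One weight vector with fan-in 64.
mu, log_sigma = torch.tensor(0.0), torch.tensor(-2.0)
loc = torch.randn(64)
loc = loc / loc.norm()
w = sample_radial_directional(mu, log_sigma, loc)
print(w.shape, w.norm())  # the norm of w equals the sampled radius r

Because the radius is a single per-neuron quantity, it directly exposes neuron importance: neurons whose radial density concentrates near zero are candidates for removal, which is the structured compression the abstract refers to.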


Related Research

07/01/2019
Radial Bayesian Neural Networks: Robust Variational Inference In Big Models
We propose Radial Bayesian Neural Networks: a variational distribution f...

02/13/2023
Variational Bayesian Neural Networks via Resolution of Singularities
In this work, we advocate for the importance of singular learning theory...

03/06/2017
Multiplicative Normalizing Flows for Variational Bayesian Neural Networks
We reinterpret multiplicative noise in neural networks as auxiliary rand...

07/06/2021
The QR decomposition for radial neural networks
We provide a theoretical framework for neural networks in terms of the r...

02/12/2019
Gaussian Mean Field Regularizes by Limiting Learned Information
Variational inference with a factorized Gaussian posterior estimate is a...

06/13/2018
Structured Variational Learning of Bayesian Neural Networks with Horseshoe Priors
Bayesian Neural Networks (BNNs) have recently received increasing attent...

03/30/2020
The Pade Approximant Based Network for Variational Problems
In solving the variational problem, the key is to efficiently find the t...