The k-tied Normal Distribution: A Compact Parameterization of Gaussian Mean Field Posteriors in Bayesian Neural Networks

02/07/2020
by Jakub Świątkowski, et al.

Variational Bayesian Inference is a popular methodology for approximating posterior distributions over Bayesian neural network weights. Recent work developing this class of methods has explored ever richer parameterizations of the approximate posterior in the hope of improving performance. In contrast, here we share a curious experimental finding that suggests instead restricting the variational distribution to a more compact parameterization. For a variety of deep Bayesian neural networks trained using Gaussian mean-field variational inference, we find that the posterior standard deviations consistently exhibit strong low-rank structure after convergence. This means that by decomposing these variational parameters into a low-rank factorization, we can make our variational approximation more compact without decreasing the models' performance. Furthermore, we find that such factorized parameterizations improve the signal-to-noise ratio of stochastic gradient estimates of the variational lower bound, resulting in faster convergence.
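
Concretely, the "k-tied" idea is to tie the per-weight posterior standard deviations of an m × n weight matrix through a rank-k product of two positive factors, so that only (m + n) · k scale parameters are stored instead of m · n. The snippet below is a minimal NumPy sketch of that parameterization, not the authors' code: the softplus positivity transform and the helper name sample_weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(x):
    # One possible positivity transform for the factors (an assumption;
    # the k-tied construction only needs the tied scales to be positive).
    return np.logaddexp(0.0, x)

def sample_weights(mu, u, v, rng):
    """Draw one weight sample from a k-tied Gaussian mean-field posterior.

    mu : (m, n) free per-weight means.
    u  : (m, k), v : (n, k) unconstrained factors. The per-weight standard
         deviations are tied through the rank-k product
         sigma[i, j] = sum_a softplus(u)[i, a] * softplus(v)[j, a],
         so only (m + n) * k scale parameters are stored instead of m * n.
    """
    sigma = softplus(u) @ softplus(v).T      # (m, n) low-rank std. deviations
    eps = rng.standard_normal(mu.shape)      # reparameterization-trick noise
    return mu + sigma * eps

# Example: a 512 x 256 layer with scales tied at rank k = 2.
m, n, k = 512, 256, 2
mu = 0.01 * rng.standard_normal((m, n))
u = rng.standard_normal((m, k))
v = rng.standard_normal((n, k))

W = sample_weights(mu, u, v, rng)
print(W.shape)  # (512, 256), using (m + n) * k = 1536 scale parameters
```

Note that the noise eps is still drawn independently per weight; only the parameterization of the scales changes, which is what makes the approximation more compact and, per the abstract, improves the signal-to-noise ratio of the stochastic ELBO gradients.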


Related research

- 06/13/2021 · Wide Mean-Field Variational Bayesian Neural Networks Ignore the Data
- 09/15/2022 · On the detrimental effect of invariances in the likelihood for variational inference
- 02/17/2022 · Sampling Approximately Low-Rank Ising Models: MCMC meets Variational Methods
- 07/15/2023 · Minimal Random Code Learning with Mean-KL Parameterization
- 03/15/2021 · Sampling-free Variational Inference for Neural Networks with Multiplicative Activation Noise
- 02/10/2020 · Try Depth Instead of Weight Correlations: Mean-field is a Less Restrictive Assumption for Deeper Networks
- 02/23/2022 · Wide Mean-Field Bayesian Neural Networks Ignore the Data
