Stochastic Neural Network with Kronecker Flow

06/10/2019
by Chin-Wei Huang et al.

Recent advances in variational inference enable the modelling of highly structured joint distributions, but are limited in their capacity to scale to the high-dimensional setting of stochastic neural networks. This limitation motivates a need for scalable parameterizations of the noise generation process, in a manner that adequately captures the dependencies among the various parameters. In this work, we address this need and present the Kronecker Flow, a generalization of the Kronecker product to invertible mappings designed for stochastic neural networks. We apply our method to variational Bayesian neural networks on predictive tasks, PAC-Bayes generalization bound estimation, and approximate Thompson sampling in contextual bandits. In all setups, our methods prove to be competitive with existing methods and better than the baselines.
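The core scaling idea named in the abstract, generalizing the Kronecker product to invertible mappings over network parameters, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: it shows only the baseline Kronecker-factored linear case, where a weight-noise matrix is transformed by two small factors A and B instead of one large dense map, and the log-determinant of the induced flow is correspondingly cheap. All variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 3  # shape of a weight matrix in the network

# Hypothetical Kronecker-factored linear flow: learn only the small
# factors A (m x m) and B (n x n) rather than a full (m*n) x (m*n)
# transform over the flattened weights.
A = np.eye(m) + 0.1 * rng.standard_normal((m, m))
B = np.eye(n) + 0.1 * rng.standard_normal((n, n))

eps = rng.standard_normal((m, n))  # base noise for the weight matrix
W = A @ eps @ B.T                  # vec(W) = (B kron A) vec(eps)

# Log-determinant of the induced (m*n)-dimensional linear map needs
# only the two small determinants: det(B kron A) = det(A)^n * det(B)^m.
logdet = n * np.log(abs(np.linalg.det(A))) + m * np.log(abs(np.linalg.det(B)))

# Sanity check against the explicit Kronecker matrix (column-major vec).
K = np.kron(B, A)
assert np.allclose(W.flatten(order="F"), K @ eps.flatten(order="F"))
assert np.isclose(logdet, np.log(abs(np.linalg.det(K))))
```

The paper's contribution is to move beyond this linear case to invertible nonlinear mappings with the same Kronecker-style parameter sharing; the sketch only conveys why the factorization makes the noise-generation process tractable in high dimensions.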


Related research

- 10/13/2017 · Bayesian Hypernetworks
  We propose Bayesian hypernetworks: a framework for approximate Bayesian ...
- 03/14/2019 · Functional Variational Bayesian Neural Networks
  Variational Bayesian neural networks (BNNs) perform variational inferenc...
- 11/11/2022 · Do Bayesian Neural Networks Need To Be Fully Stochastic?
  We investigate the efficacy of treating all the parameters in a Bayesian...
- 10/09/2018 · Fixing Variational Bayes: Deterministic Variational Inference for Bayesian Neural Networks
  Bayesian neural networks (BNNs) hold great promise as a flexible and pri...
- 06/02/2022 · Excess risk analysis for epistemic uncertainty with application to variational inference
  We analyze the epistemic uncertainty (EU) of supervised learning in Baye...
- 06/22/2020 · Differentiable PAC-Bayes Objectives with Partially Aggregated Neural Networks
  We make three related contributions motivated by the challenge of traini...
