Stabilising priors for robust Bayesian deep learning

10/23/2019 · by Felix McGregor, et al.

Bayesian neural networks (BNNs) have developed into useful tools for probabilistic modelling due to recent advances in variational inference that enable large-scale BNNs. However, BNNs remain brittle and hard to train, especially (1) when using deep architectures with many hidden layers and (2) in situations with large weight variances. We use signal propagation theory to quantify these challenges and propose self-stabilising priors. This is achieved by a reformulation of the ELBO that allows the prior to influence network signal propagation. We then develop a stabilising prior, whose distributional parameters are adjusted before each forward pass to ensure stability of the propagating signal. This stabilised signal propagation leads to improved convergence and robustness, making it possible to train deeper networks and in noisier settings.
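The abstract's key idea is that keeping the propagating signal's variance stable lets depth scale. The paper's exact adjustment rule isn't given here, but the underlying signal-propagation argument can be sketched with a variance-preserving (He-style) prior scale: if each weight is drawn with variance 2/fan_in, a ReLU layer approximately preserves the input variance, so the signal neither explodes nor vanishes with depth. All names below are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def stable_prior_std(fan_in):
    # He-style variance-preserving scale: with w ~ N(0, 2/fan_in),
    # a ReLU layer approximately preserves the signal's variance.
    return np.sqrt(2.0 / fan_in)

def forward(x, layer_sizes):
    # Sample weights from the stabilised prior at each layer and
    # propagate; the signal variance should stay O(1) at any depth.
    h = x
    for fan_in, fan_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        w = rng.normal(0.0, stable_prior_std(fan_in), size=(fan_in, fan_out))
        h = np.maximum(h @ w, 0.0)  # ReLU activation
    return h

x = rng.normal(size=(1, 100))
deep_out = forward(x, [100] * 51)   # 50 stochastic hidden layers
print(float(np.var(deep_out)))      # remains O(1) rather than exploding or vanishing
```

Replacing `stable_prior_std` with a fixed standard deviation of 1.0 makes the variance grow by roughly a factor of fan_in/2 per layer, which is the instability the paper's self-stabilising prior is designed to prevent.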

Related research

06/12/2019 · MOPED: Efficient priors for scalable variational inference in Bayesian deep neural networks
Variational inference for Bayesian deep neural networks (DNNs) requires ...

05/14/2021 · Priors in Bayesian Deep Learning: A Review
While the choice of prior is one of the most critical parts of the Bayes...

11/10/2021 · Robust Learning via Ensemble Density Propagation in Deep Neural Networks
Learning in uncertain, noisy, or adversarial environments is a challengi...

05/28/2022 · Rethinking Bayesian Learning for Data Analysis: The Art of Prior and Inference in Sparsity-Aware Modeling
Sparse modeling for signal processing and machine learning has been at t...

02/27/2023 · Signal Propagation in Double Edged Relays
A discrete signal propagation model blending characteristics of linear w...

05/20/2016 · Variational hybridization and transformation for large inaccurate noisy-or networks
Variational inference provides approximations to the computationally int...

09/07/2023 · Prime and Modulate Learning: Generation of forward models with signed back-propagation and environmental cues
Deep neural networks employing error back-propagation for learning can s...
