
Bayesian Neural Networks: Essentials
Bayesian neural networks utilize probabilistic layers that capture uncer...

Functional Variational Bayesian Neural Networks
Variational Bayesian neural networks (BNNs) perform variational inferenc...

Hierarchical Gaussian Process Priors for Bayesian Neural Network Weights
Probabilistic neural networks are typically modeled with independent wei...

Multimodal Deep Gaussian Processes
We propose a novel Bayesian approach to modelling multimodal data genera...

Functional Space Variational Inference for Uncertainty Estimation in Computer Aided Diagnosis
Deep neural networks have revolutionized medical image analysis and dise...

Bayesian Layers: A Module for Neural Network Uncertainty
We describe Bayesian Layers, a module designed for fast experimentation ...

Output-Constrained Bayesian Neural Networks
Bayesian neural network (BNN) priors are defined in parameter space, mak...
Hybrid Bayesian Neural Networks with Functional Probabilistic Layers
Bayesian neural networks provide a direct and natural way to extend standard deep neural networks to probabilistic deep learning through probabilistic layers that, traditionally, encode weight (and bias) uncertainty. In particular, hybrid Bayesian neural networks combine standard deterministic layers with a few probabilistic layers judiciously positioned in the network for uncertainty estimation. A major benefit of Bayesian inference is that priors, in principle, provide the means to encode prior knowledge for use in inference and prediction. However, it is difficult to specify priors on weights, since weights have no intuitive interpretation, and the relationship between priors on weights and the functions computed by the network is difficult to characterize. In contrast, functions are intuitive to interpret and are direct, since they map inputs to outputs. It is therefore natural to specify priors on functions to encode prior knowledge, and to use them in inference and prediction over functions. To support this, we propose hybrid Bayesian neural networks with functional probabilistic layers that encode function (and activation) uncertainty. We discuss their foundations in functional Bayesian inference, functional variational inference, sparse Gaussian processes, and sparse variational Gaussian processes. We further perform a few proof-of-concept experiments using GPflux, a new library that provides Gaussian process layers and supports their use with deterministic Keras layers to form hybrid neural network and Gaussian process models.