Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks

Large multilayer neural networks trained with backpropagation have recently achieved state-of-the-art results in a wide range of problems. However, backpropagation-based learning still has several disadvantages: a large number of hyperparameters must be tuned to the data, the predictions are not calibrated probabilistically, and the networks tend to overfit the training data. In principle, the Bayesian approach to learning neural networks does not have these problems. However, existing Bayesian techniques do not scale to large datasets and network sizes. In this work we present a novel scalable method for learning Bayesian neural networks, called probabilistic backpropagation (PBP). Like classical backpropagation, PBP works in two phases: a forward propagation of probabilities through the network, followed by a backward computation of gradients. A series of experiments on ten real-world datasets shows that PBP is significantly faster than other techniques while offering competitive predictive accuracy. Our experiments also show that PBP provides accurate estimates of the posterior variance of the network weights.
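
To make the forward pass described above concrete, here is a minimal illustrative sketch, not the authors' code: it propagates Gaussian means and variances through one linear layer whose weights have a factorized Gaussian posterior N(m, v), and moment-matches a ReLU with a Gaussian. The layer shapes, variable names, and the use of NumPy/SciPy are assumptions made for illustration.

    # Illustrative sketch (not the authors' implementation) of PBP-style
    # forward moment propagation through one layer.
    import numpy as np
    from scipy.stats import norm

    def linear_moments(a_mean, a_var, w_mean, w_var):
        """Mean/variance of z = W a / sqrt(d_in) for independent Gaussian
        weights W ~ N(w_mean, w_var) and inputs a ~ N(a_mean, a_var)."""
        d_in = a_mean.shape[0]
        z_mean = w_mean @ a_mean / np.sqrt(d_in)
        # Var[W_ij a_j] expanded for independent W and a, summed over j
        z_var = (w_var @ (a_mean ** 2)
                 + (w_mean ** 2) @ a_var
                 + w_var @ a_var) / d_in
        return z_mean, z_var

    def relu_moments(z_mean, z_var):
        """Moment-match max(0, z), z ~ N(z_mean, z_var), with a Gaussian."""
        s = np.sqrt(z_var)
        alpha = z_mean / s
        a_mean = z_mean * norm.cdf(alpha) + s * norm.pdf(alpha)
        e2 = ((z_mean ** 2 + z_var) * norm.cdf(alpha)
              + z_mean * s * norm.pdf(alpha))
        return a_mean, np.maximum(e2 - a_mean ** 2, 1e-12)

    # Example: 50 hidden units on a 10-dimensional input (shapes assumed).
    rng = np.random.default_rng(0)
    x = rng.normal(size=10)
    m = rng.normal(size=(50, 10))          # posterior means of the weights
    v = np.full((50, 10), 0.1)             # posterior variances of the weights
    z_mean, z_var = linear_moments(x, np.zeros_like(x), m, v)
    a_mean, a_var = relu_moments(z_mean, z_var)

In the full algorithm, the forward pass ends in the marginal likelihood of the observed target; gradients of its logarithm with respect to the posterior means and variances, obtained by a backward pass analogous to backpropagation, yield the assumed-density-filtering updates to the weight posterior. The sketch above covers only the forward moment propagation.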

Related research

11/11/2015 - Training Deep Gaussian Processes using Stochastic Expectation Propagation and Probabilistic Backpropagation
Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisati...

11/02/2016 - Natural-Parameter Networks: A Class of Probabilistic Neural Networks
Neural networks (NN) have achieved state-of-the-art performance in vario...

05/15/2017 - Learning Probabilistic Programs Using Backpropagation
Probabilistic modeling enables combining domain knowledge with learning ...

10/07/2013 - Mean Field Bayes Backpropagation: scalable training of multilayer neural networks with binary weights
Significant success has been reported recently using deep neural network...

07/05/2022 - Meta-Learning a Real-Time Tabular AutoML Method For Small Data
We present TabPFN, an AutoML method that is competitive with the state o...

08/17/2023 - TinyProp – Adaptive Sparse Backpropagation for Efficient TinyML On-device Learning
Training deep neural networks using backpropagation is very memory and c...

05/24/2019 - Memorized Sparse Backpropagation
Neural network learning is typically slow since backpropagation needs to...
