A Bayesian Approach to Invariant Deep Neural Networks

07/20/2021
by Nikolaos Mourdoukoutas, et al.

We propose a novel Bayesian neural network architecture that can learn invariances from data alone by inferring a posterior distribution over different weight-sharing schemes. We show that our model outperforms non-invariant architectures when trained on datasets that contain specific invariances, and that this advantage holds even when no data augmentation is performed.
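To make the core idea concrete, here is a minimal sketch, not the paper's actual method, of what "a posterior over weight-sharing schemes" could look like for a single 2D filter. Each candidate scheme symmetrizes the filter under one transformation (identity, horizontal flip, 90° rotation), and a categorical distribution with learnable logits (here set by hand, but learned variationally in practice) mixes the resulting shared-weight filters. All names and the choice of candidate transformations are illustrative assumptions.

```python
import numpy as np


def transform(filt, op):
    """Apply one candidate symmetry operation to a 2D filter (illustrative set)."""
    if op == "identity":
        return filt
    if op == "hflip":
        return filt[:, ::-1]
    if op == "rot90":
        return np.rot90(filt)
    raise ValueError(f"unknown op: {op}")


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


class InvariantFilter:
    """Toy filter whose effective weights mix several weight-sharing schemes.

    A categorical distribution over schemes stands in for the posterior;
    in a real Bayesian treatment its logits would be inferred from data.
    """

    def __init__(self, size=3, ops=("identity", "hflip", "rot90"), seed=0):
        rng = np.random.default_rng(seed)
        self.base = rng.standard_normal((size, size))  # unconstrained weights
        self.ops = ops
        self.logits = np.zeros(len(ops))  # posterior parameters (hand-set here)

    def effective_filter(self):
        # Symmetrize the base filter under each scheme, then average the
        # symmetrized filters under the categorical posterior.
        probs = softmax(self.logits)
        shared = [0.5 * (self.base + transform(self.base, op)) for op in self.ops]
        return sum(p * w for p, w in zip(probs, shared))


# If the posterior concentrates on the horizontal-flip scheme, the effective
# filter becomes (numerically) symmetric under horizontal flips.
f = InvariantFilter()
f.logits = np.array([-10.0, 10.0, -10.0])  # strongly favor "hflip"
w = f.effective_filter()
print(np.allclose(w, w[:, ::-1], atol=1e-3))
```

The point of the sketch is that invariance is not hard-coded: which symmetrization dominates depends on the inferred distribution, so the same architecture can specialize to whatever invariance the data supports.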


