Weight Pruning and Uncertainty in Radio Galaxy Classification

11/23/2021
by Devina Mohan et al.

In this work we use variational inference to quantify the degree of epistemic uncertainty in model predictions of radio galaxy classification and show that the level of model posterior variance for individual test samples is correlated with human uncertainty when labelling radio galaxies. We explore model performance and uncertainty calibration for a variety of different weight priors and suggest that a sparse prior produces better-calibrated uncertainty estimates. Using the posterior distributions for individual weights, we show that signal-to-noise ratio (SNR) ranking allows the fully-connected layers to be pruned to the 30% level without significant loss of performance, and that this pruning increases the predictive uncertainty in the model. Finally, we show that, in common with other work in this field, we experience a cold posterior effect. We examine whether adapting the cost function in our model to accommodate model misspecification can compensate for this effect, but find that it does not make a significant difference. We also examine the effect of principled data augmentation and find that it improves upon the baseline but does not fully compensate for the observed effect. We interpret this as an indication that the cold posterior effect arises from the overly effective curation of our training sample, which leads to likelihood misspecification, and we raise this as a potential issue for future Bayesian deep learning approaches to radio galaxy classification.
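
For readers unfamiliar with SNR ranking: under a mean-field Gaussian variational posterior, each weight has a mean mu and standard deviation sigma, and the ratio |mu|/sigma measures how informative a weight is relative to its posterior uncertainty, making it a natural pruning criterion. The sketch below is a hypothetical illustration, not the authors' code; the keep_fraction parameter and the softplus parameterisation of sigma are assumptions.

```python
import torch
import torch.nn.functional as F

def snr_prune_mask(mu: torch.Tensor, rho: torch.Tensor,
                   keep_fraction: float) -> torch.Tensor:
    """Return a 0/1 mask keeping the fraction of weights with the
    highest posterior signal-to-noise ratio |mu| / sigma."""
    sigma = F.softplus(rho)                 # sigma = log(1 + exp(rho)), assumed parameterisation
    snr = mu.abs() / sigma
    k = max(1, int(keep_fraction * snr.numel()))
    threshold = torch.topk(snr.flatten(), k).values.min()
    return (snr >= threshold).float()

# Example: prune a fully-connected layer down to 30% of its weights
# by zeroing the posterior means of the low-SNR weights.
mu = torch.randn(256, 128)                  # variational means
rho = torch.full_like(mu, -5.0)             # small initial sigma
mask = snr_prune_mask(mu, rho, keep_fraction=0.3)
pruned_mu = mu * mask
```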
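
The cold posterior effect refers to the empirical finding that tempered posteriors, p(theta|D)^(1/T) with temperature T < 1, can outperform the untempered Bayes posterior (T = 1). One common practical realisation in variational training, shown in the minimal sketch below, down-weights the KL (complexity) term of the minibatch ELBO; the nll, kl, and num_batches inputs are assumptions for illustration, not the authors' implementation.

```python
import torch

def tempered_loss(nll: torch.Tensor, kl: torch.Tensor,
                  temperature: float, num_batches: int) -> torch.Tensor:
    """Minibatch variational objective with a tempered KL term.

    temperature = 1 recovers the standard ELBO; temperature < 1
    down-weights the complexity term, one common way of realising
    a 'cold' posterior in practice.
    """
    return nll + temperature * kl / num_batches
```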

