Parameters Estimation for the Cosmic Microwave Background with Bayesian Neural Networks

11/19/2019
by Hector J. Hortua, et al.

In this paper, we present the first study that compares different models of Bayesian Neural Networks (BNNs) to predict the posterior distribution of the cosmological parameters directly from Cosmic Microwave Background (CMB) maps. We focus our analysis on four different methods to sample the weights of the network during training: Dropout, DropConnect, Reparameterization Trick (RT), and Flipout. We find that Flipout outperforms all other methods regardless of the architecture used, and provides tighter constraints on the cosmological parameters. Additionally, we describe existing strategies for calibrating the networks and propose new ones. We show how, by tuning the regularization parameter for the scale of the approximate posterior on the weights in Flipout and RT, we can produce unbiased and reliable uncertainty estimates; that is, the regularizer acts as a hyperparameter analogous to the dropout rate in Dropout. The best performance is nevertheless achieved with a more convenient method, in which the network is left unconstrained during training to reach the best uncalibrated performance, and the confidence intervals are calibrated in a subsequent phase. Furthermore, we claim that correctly calibrating these networks does not change the behavior of the epistemic and aleatoric uncertainties provided by BNNs as the training dataset size changes. The results reported in this paper can be extended to other cosmological datasets in order to estimate confidence regions for features that can be extracted directly from the raw data, such as non-Gaussianity signals or foreground emissions.
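As an illustration of the setup the abstract describes, the following is a minimal sketch (not the authors' code) of a Flipout-based network built with TensorFlow Probability, where a hypothetical KL_SCALE factor plays the role of the regularization hyperparameter for the scale of the approximate posterior on the weights; the layer sizes, N_TRAIN, and the two-parameter output head are illustrative assumptions. Repeated stochastic forward passes give the epistemic spread, while the predicted variances capture the aleatoric part.

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

KL_SCALE = 1e-3     # hypothetical regularization strength for the weight posterior
N_TRAIN = 10000     # hypothetical number of training maps

# Scaled KL divergence between the approximate posterior q and the prior p;
# the scale factor acts as a hyperparameter, analogous to the dropout rate.
kl_fn = lambda q, p, _: KL_SCALE * tfd.kl_divergence(q, p) / N_TRAIN

model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(128, 128, 1)),   # a small CMB map patch
    tfp.layers.Convolution2DFlipout(16, kernel_size=3, activation="relu",
                                    kernel_divergence_fn=kl_fn),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tfp.layers.DenseFlipout(64, activation="relu", kernel_divergence_fn=kl_fn),
    # Two outputs (mean, log-variance) per cosmological parameter, so the
    # network also models the aleatoric uncertainty; here 2 parameters.
    tfp.layers.DenseFlipout(2 * 2, kernel_divergence_fn=kl_fn),
])

# Epistemic uncertainty: spread of predictions over repeated stochastic passes.
# maps = ...  (batch of CMB patches)
# samples = tf.stack([model(maps, training=True) for _ in range(50)], axis=0)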

Related research

06/23/2019 · Confidence Calibration for Convolutional Neural Networks Using Structured Dropout
In classification applications, we often want probabilistic predictions ...

09/27/2018 · Dropout Distillation for Efficiently Estimating Model Confidence
We propose an efficient way to output better calibrated uncertainty scores ...

05/14/2020 · Constraining the Reionization History using Bayesian Normalizing Flows
The next generation 21 cm surveys open a new window onto the early stages ...

02/03/2021 · A Bayesian Neural Network based on Dropout Regulation
Bayesian Neural Networks (BNN) have recently emerged in the Deep Learning ...

01/27/2021 · Bayesian Nested Neural Networks for Uncertainty Calibration and Adaptive Compression
Nested networks or slimmable networks are neural networks whose architecture ...

04/20/2021 · A Bayesian Convolutional Neural Network for Robust Galaxy Ellipticity Regression
Cosmic shear estimation is an essential scientific goal for large galaxy ...

11/19/2018 · Variational Bayesian Dropout
Variational dropout (VD) is a generalization of Gaussian dropout, which ...
