Bayesian Deep Learning Hyperparameter Search for Robust Function Mapping to Polynomials with Noise

06/23/2021
by Nidhin Harilal, et al.

Advances in neural architecture search, as well as in the explainability and interpretability of connectionist architectures, have been reported in the recent literature. However, our understanding of how to design Bayesian Deep Learning (BDL) hyperparameters, specifically the depth, width, and ensemble size, for robust function mapping with uncertainty quantification is still emerging. This paper attempts to further that understanding by mapping Bayesian connectionist representations to polynomials of different orders under varying noise types and ratios. We examine the noise-contaminated polynomials to search for the combination of hyperparameters that can extract the underlying polynomial signals while quantifying uncertainties based on the noise attributes. Specifically, we study whether an appropriate neural architecture and ensemble configuration can be found to detect the signal of any n-th order polynomial contaminated with noise of different distributions and signal-to-noise ratios (SNRs). Our results suggest the possible existence of an optimal network depth and an optimal ensemble size for prediction skill and uncertainty quantification, respectively. No such optimum is discernible for width, although the marginal performance gain diminishes as width becomes large. Our experiments and insights can provide direction for understanding the theoretical properties of BDL representations and for designing practical solutions.
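To make the kind of study the abstract describes more concrete, the following is a minimal sketch, not the authors' code: a polynomial is contaminated with noise at a chosen SNR, an ensemble of MLPs with configurable depth and width is fit to the noisy samples, and both prediction skill (error against the clean signal) and uncertainty (ensemble spread) are recorded. The function names, hyperparameter values, noise families, and the use of scikit-learn's MLPRegressor as ensemble members are illustrative assumptions.

```python
# Hypothetical sketch of the experiment described in the abstract: fit a deep
# ensemble to a noise-contaminated polynomial and record prediction error and
# ensemble spread. All names and settings below are assumptions, not the
# paper's actual code.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def noisy_polynomial(coeffs, n=1000, snr=10.0, noise="gaussian", seed=0):
    """Sample y = p(x) + eps, with eps scaled to the requested SNR."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, size=n)
    signal = np.polyval(coeffs, x)
    noise_std = signal.std() / np.sqrt(snr)
    if noise == "gaussian":
        eps = rng.normal(0.0, noise_std, size=n)
    else:  # heavy-tailed alternative, rescaled to the same variance
        eps = noise_std * rng.standard_t(df=3, size=n) / np.sqrt(3.0)
    return x.reshape(-1, 1), signal + eps, signal

def fit_ensemble(x, y, depth=3, width=64, n_members=5):
    """Train n_members MLPs that differ only in their random initialisation."""
    members = []
    for seed in range(n_members):
        net = MLPRegressor(hidden_layer_sizes=(width,) * depth,
                           max_iter=2000, random_state=seed)
        members.append(net.fit(x, y))
    return members

def predict(members, x):
    preds = np.stack([m.predict(x) for m in members])
    # Ensemble mean measures prediction skill; spread measures uncertainty.
    return preds.mean(axis=0), preds.std(axis=0)

if __name__ == "__main__":
    coeffs = [1.0, -2.0, 0.5, 0.0]  # a 3rd-order polynomial
    x, y_noisy, y_clean = noisy_polynomial(coeffs, snr=5.0)
    for depth in (1, 3, 6):
        ens = fit_ensemble(x, y_noisy, depth=depth, width=64, n_members=5)
        mean, std = predict(ens, x)
        rmse = mean_squared_error(y_clean, mean) ** 0.5
        print(f"depth={depth}: RMSE vs clean signal {rmse:.3f}, "
              f"mean predictive std {std.mean():.3f}")
```

Sweeping depth, width, ensemble size, SNR, and the noise family in such a loop is one way to probe the optima the abstract reports.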


Related research

02/20/2023
Quantifying uncertainty for deep learning based forecasting and flow-reconstruction using neural architecture search ensembles
Classical problems in computational physics such as data-driven forecast...

07/21/2021
Ensemble-based Uncertainty Quantification: Bayesian versus Credal Inference
The idea to distinguish and quantify two important types of uncertainty,...

10/08/2022
Unified Probabilistic Neural Architecture and Weight Ensembling Improves Model Robustness
Robust machine learning models with accurately calibrated uncertainties ...

02/09/2022
Model Architecture Adaption for Bayesian Neural Networks
Bayesian Neural Networks (BNNs) offer a mathematically grounded framewor...

10/01/2019
Sub-Architecture Ensemble Pruning in Neural Architecture Search
Neural architecture search (NAS) is gaining more and more attention in r...

05/30/2022
Uncertainty Quantification and Resource-Demanding Computer Vision Applications of Deep Learning
Bringing deep neural networks (DNNs) into safety critical applications s...

06/13/2020
Collegial Ensembles
Modern neural network performance typically improves as model size incre...
