Confident Neural Network Regression with Bootstrapped Deep Ensembles

02/22/2022
by Laurens Sluijterman, et al.

With the rising popularity and usage of neural networks, trustworthy uncertainty estimation is becoming increasingly essential. In this paper we present a computationally cheap extension of Deep Ensembles for the regression setting, called Bootstrapped Deep Ensembles, which explicitly takes the effect of finite data into account using a modified version of the parametric bootstrap. We demonstrate through a simulation study that our method produces comparable or better prediction intervals and superior confidence intervals relative to Deep Ensembles and other state-of-the-art methods. As an added bonus, our method is better at detecting overfitting than standard Deep Ensembles.
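To make the idea concrete, the sketch below shows one simple way a parametric-bootstrap ensemble for regression uncertainty can be set up. It is an illustrative assumption, not the authors' exact algorithm: the network architecture, the homoscedastic Gaussian noise model, the use of scikit-learn's MLPRegressor as a stand-in for a deep network, and all hyperparameters are placeholder choices.

```python
# Minimal sketch of a parametric-bootstrap ensemble for regression.
# All modelling choices here are illustrative, not the paper's method.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy data: y = sin(x) + noise.
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.3, size=500)

# Step 1: fit a base network and estimate the residual noise level.
base = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=2000, random_state=0)
base.fit(X, y)
mu_hat = base.predict(X)
sigma_hat = np.std(y - mu_hat)

# Step 2: parametric bootstrap -- each ensemble member is trained on
# synthetic targets drawn from the fitted model, so the spread of the
# members reflects the effect of having only finite data.
n_members = 5
members = []
for m in range(n_members):
    y_boot = mu_hat + rng.normal(0, sigma_hat, size=len(y))
    net = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=2000, random_state=m)
    net.fit(X, y_boot)
    members.append(net)

# Step 3: combine member predictions into a confidence interval for the
# regression function and a wider prediction interval for new observations.
X_test = np.linspace(-3, 3, 200).reshape(-1, 1)
preds = np.stack([net.predict(X_test) for net in members])
mean = preds.mean(axis=0)
model_std = preds.std(axis=0)                  # spread due to finite data
ci = (mean - 1.96 * model_std, mean + 1.96 * model_std)
pi_std = np.sqrt(model_std**2 + sigma_hat**2)  # add observation noise
pi = (mean - 1.96 * pi_std, mean + 1.96 * pi_std)
```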

Related research

10/10/2022 · Layer Ensembles
Deep Ensembles, as a type of Bayesian Neural Networks, can be used to es...

10/08/2020 · Prediction intervals for Deep Neural Networks
The aim of this paper is to propose a suitable method for constructing p...

07/09/2020 · Fixed-time descriptive statistics underestimate extremes of epidemic curve ensembles
Across the world, scholars are racing to predict the spread of the novel...

09/24/2018 · Deep Confidence: A Computationally Efficient Framework for Calculating Reliable Errors for Deep Neural Networks
Deep learning architectures have proved versatile in a number of drug di...

07/19/2020 · Prediction Intervals: Split Normal Mixture from Quality-Driven Deep Ensembles
Prediction intervals are a machine- and human-interpretable way to repre...

04/17/2022 · Multi-Model Ensemble Optimization
Methodology and optimization algorithms for sparse regression are extend...

06/16/2016 · The Effect of Heteroscedasticity on Regression Trees
Regression trees are becoming increasingly popular as omnibus predicting...
