Novel Uncertainty Framework for Deep Learning Ensembles

04/09/2019
by Tal Kachman, et al.

Deep neural networks have become the default choice for many machine learning tasks such as classification and regression. Dropout, a method commonly used to improve the convergence of deep neural networks, generates an ensemble of thinned networks with extensive weight sharing. Recent studies have shown that dropout can be viewed as approximate variational inference in Gaussian processes and can be used as a practical tool to obtain uncertainty estimates from the network. We propose a novel statistical-mechanics-based framework for dropout and use it to derive a new, generic algorithm that focuses on estimating the variance of the loss as measured by the ensemble of thinned networks. Our approach can be applied to a wide range of deep neural network architectures and machine learning tasks. In classification, the algorithm allows a don't-know answer to be generated, which can increase the reliability of the classifier. Empirically, we demonstrate state-of-the-art AUC results on publicly available benchmarks.
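The abstract does not spell out the algorithm, so the sketch below is only a rough illustration of the general idea using standard Monte Carlo dropout in PyTorch: predictions are averaged over the ensemble of thinned networks obtained by keeping dropout active at test time, and a don't-know answer is returned when the ensemble variance is high. The model architecture, number of samples, and variance threshold are hypothetical, and the sketch tracks the variance of the predicted probabilities rather than the variance of the loss described above.

```python
# Minimal sketch (not the paper's exact algorithm): Monte Carlo dropout as an
# ensemble of thinned networks, with abstention on high-variance predictions.
import torch
import torch.nn as nn

class MCDropoutClassifier(nn.Module):
    """Toy classifier with a dropout layer; architecture is illustrative."""
    def __init__(self, in_dim, n_classes, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Dropout(p),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):
        return self.net(x)

def predict_or_abstain(model, x, n_samples=50, var_threshold=0.05):
    """Average softmax outputs over thinned networks sampled via dropout and
    flag inputs whose predictive variance exceeds a threshold as don't-know."""
    model.train()  # keep dropout stochastic at test time (no weights are updated)
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for _ in range(n_samples)])
    mean, var = probs.mean(dim=0), probs.var(dim=0)
    pred = mean.argmax(dim=-1)
    # Variance of the ensemble on the predicted class drives the abstention.
    uncertain = var.gather(-1, pred.unsqueeze(-1)).squeeze(-1) > var_threshold
    return pred, uncertain  # uncertain == True -> return a don't-know answer
```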

Related research

06/06/2015  Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Deep learning tools have gained tremendous attention in applied machine ...

01/07/2021  A Novel Regression Loss for Non-Parametric Uncertainty Optimization
Quantification of uncertainty is one of the most promising approaches to...

12/23/2020  Second-Moment Loss: A Novel Regression Objective for Improved Uncertainties
Quantification of uncertainty is one of the most promising approaches to...

04/09/2019  Exploring Uncertainty Measures for Image-Caption Embedding-and-Retrieval Task
With the wide development of black-box machine learning algorithms, part...

03/21/2019  Empirical confidence estimates for classification by deep neural networks
How well can we estimate the probability that the classification, C(f(x)...

06/12/2017  Confident Multiple Choice Learning
Ensemble methods are arguably the most trustworthy techniques for boosti...

01/21/2020  Variational Dropout Sparsification for Particle Identification speed-up
Accurate particle identification (PID) is one of the most important aspe...
