On Power Laws in Deep Ensembles

07/16/2020
by Ekaterina Lobacheva, et al.

Ensembles of deep neural networks are known to achieve state-of-the-art performance in uncertainty estimation and to improve accuracy. In this work, we focus on a classification problem and investigate the behavior of both non-calibrated and calibrated negative log-likelihood (CNLL) of a deep ensemble as a function of the ensemble size and the member network size. We identify the conditions under which CNLL follows a power law w.r.t. ensemble size or member network size, and analyze the dynamics of the parameters of the discovered power laws. Our key practical finding is that one large network may perform worse than an ensemble of several medium-size networks with the same total number of parameters (we call such an ensemble a memory split). Using the detected power-law-like dependencies, and based on a relatively small number of trained networks, we can predict (1) the possible gain from ensembling networks of a given structure, and (2) the optimal memory split given a memory budget. We describe the memory split advantage effect in more detail in arXiv:2005.07292.
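To illustrate how such a power law can be used for extrapolation, here is a minimal sketch (not the authors' code): it assumes the functional form CNLL(n) ≈ c + b·n^(−a) from the abstract, fits a and b by least squares in log-log space on synthetic CNLL values, and extrapolates to a larger ensemble size. The data, the asymptote c, and all variable names are illustrative assumptions.

```python
import math

def fit_power_law(sizes, cnll, c):
    """Least-squares fit of CNLL(n) ~ c + b * n**(-a) in log-log space,
    assuming the asymptote c is known (or estimated separately,
    e.g. from the largest available ensemble)."""
    xs = [math.log(n) for n in sizes]
    ys = [math.log(v - c) for v in cnll]  # log(CNLL - c) = log b - a * log n
    m = len(xs)
    mean_x = sum(xs) / m
    mean_y = sum(ys) / m
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return -slope, math.exp(intercept)  # (a, b)

# Synthetic CNLL curve generated from c + b * n**(-a); purely illustrative.
a_true, b_true, c_true = 0.5, 0.3, 0.2
sizes = [1, 2, 3, 4, 5, 6, 8, 10]
cnll = [c_true + b_true * n ** (-a_true) for n in sizes]

a_hat, b_hat = fit_power_law(sizes, cnll, c_true)

# Extrapolate: predicted CNLL of a 20-member ensemble from the fitted law.
pred_20 = c_true + b_hat * 20 ** (-a_hat)
```

With the fitted parameters, one can compare predicted CNLL curves for different (ensemble size, member size) combinations under a fixed parameter budget, which is the idea behind choosing the optimal memory split.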

Related research

05/14/2020 · Deep Ensembles on a Fixed Memory Budget: One Wide Network or Several Thinner Ones?
One of the generally accepted views of modern deep learning is that incr...

02/17/2020 · BatchEnsemble: an Alternative Approach to Efficient Ensemble and Lifelong Learning
Ensembles, where multiple neural networks are trained individually and t...

10/30/2022 · A Solvable Model of Neural Scaling Laws
Large language models with a huge number of parameters, when trained on ...

09/18/2017 · Coupled Ensembles of Neural Networks
We investigate in this paper the architecture of deep convolutional netw...

11/29/2021 · On the Effectiveness of Neural Ensembles for Image Classification with Small Datasets
Deep neural networks represent the gold standard for image classificatio...

05/29/2018 · Bayesian Inference with Anchored Ensembles of Neural Networks, and Application to Reinforcement Learning
The use of ensembles of neural networks (NNs) for the quantification of ...

10/28/2022 · End-to-end Ensemble-based Feature Selection for Paralinguistics Tasks
The events of recent years have highlighted the importance of telemedici...
