Joint Training of Deep Ensembles Fails Due to Learner Collusion

01/26/2023
by Alan Jeffares, et al.

Ensembles of machine learning models are well established as a powerful method for improving performance over a single model. Traditionally, ensembling algorithms train their base learners independently or sequentially with the goal of optimizing their joint performance. In the case of deep ensembles of neural networks, we have the opportunity to directly optimize the true objective: the joint performance of the ensemble as a whole. Surprisingly, however, directly minimizing the ensemble loss appears to be applied only rarely in practice. Instead, most previous work trains the individual models independently, with ensembling performed post hoc. In this work, we show that this is for good reason: joint optimization of the ensemble loss results in degenerate behavior. We approach this problem by decomposing the ensemble objective into the strength of the base learners and the diversity between them. We find that joint optimization leads to a phenomenon in which the base learners collude to artificially inflate their apparent diversity. This pseudo-diversity fails to generalize beyond the training data, resulting in a larger generalization gap. We then demonstrate the practical implications of this effect, finding that, in some cases, a balance between independent training and joint optimization can improve performance over the former while avoiding the degeneracies of the latter.
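
The contrast between the two training regimes can be made concrete with a small sketch. The following is a minimal PyTorch illustration (not the authors' code): it contrasts independent training, in which each base learner minimizes its own loss, with joint training, in which the loss of the ensemble's averaged prediction is minimized. The interpolation weight beta and all helper names are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn as nn

def make_learner(in_dim: int, hidden: int = 32) -> nn.Module:
    # A small MLP base learner (illustrative architecture).
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

def ensemble_loss(learners, x, y, beta: float) -> torch.Tensor:
    # beta = 0: independent training (average of per-learner losses).
    # beta = 1: joint training (loss of the averaged ensemble prediction).
    # Intermediate beta interpolates between the two regimes.
    mse = nn.MSELoss()
    preds = [m(x) for m in learners]
    independent = torch.stack([mse(p, y) for p in preds]).mean()
    joint = mse(torch.stack(preds).mean(dim=0), y)
    return (1.0 - beta) * independent + beta * joint

# Toy usage on random regression data.
torch.manual_seed(0)
x, y = torch.randn(256, 8), torch.randn(256, 1)
learners = nn.ModuleList([make_learner(8) for _ in range(5)])
optimizer = torch.optim.Adam(learners.parameters(), lr=1e-3)
for _ in range(200):
    optimizer.zero_grad()
    loss = ensemble_loss(learners, x, y, beta=0.5)
    loss.backward()
    optimizer.step()

With beta = 0 this reduces to standard independent training followed by post hoc averaging at prediction time; with beta = 1 the gradients couple the learners through the averaged prediction, which is the regime in which the collusion effect described above arises.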

Related research

09/29/2021 · Neural Network Ensembles: Theory, Training, and the Importance of Explicit Diversity
Ensemble learning is a process by which multiple base learners are strat...

11/19/2015 · Why M Heads are Better than One: Training a Diverse Ensemble of Deep Networks
Convolutional Neural Networks have achieved state-of-the-art performance...

02/12/2019 · Joint Training of Neural Network Ensembles
We examine the practice of joint training for neural network ensembles, ...

03/06/2018 · Deep Super Learner: A Deep Ensemble for Classification Problems
Deep learning has become very popular for tasks such as predictive model...

03/24/2016 · Deep Extreme Feature Extraction: New MVA Method for Searching Particles in High Energy Physics
In this paper, we present Deep Extreme Feature Extraction (DEFE), a new ...

03/21/2023 · A Random Projection k Nearest Neighbours Ensemble for Classification via Extended Neighbourhood Rule
Ensembles based on k nearest neighbours (kNN) combine a large number of ...

05/24/2019 · EnsembleNet: End-to-End Optimization of Multi-headed Models
Ensembling is a universally useful approach to boost the performance of ...
