Structured Ensembles: an Approach to Reduce the Memory Footprint of Ensemble Methods

05/06/2021
by   Jary Pomponi, et al.

In this paper, we propose a novel ensembling technique for deep neural networks that drastically reduces the required memory compared to alternative approaches. In particular, we extract multiple sub-networks from a single, untrained neural network by solving an end-to-end optimization task that combines differentiable scaling over the original architecture with multiple regularization terms favouring the diversity of the ensemble. Since our proposal aims to detect and extract sub-structures, we call it Structured Ensemble. In a large experimental evaluation, we show that our method achieves higher or comparable accuracy to competing methods while requiring significantly less storage. In addition, we evaluate our ensembles in terms of predictive calibration and uncertainty, showing that they compare favourably with the state of the art. Finally, we draw a link to the continual learning literature and propose a modification of our framework that handles continuous streams of tasks with a sub-linear memory cost. We compare against a number of alternative strategies for mitigating catastrophic forgetting, highlighting advantages in terms of average accuracy and memory.
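The abstract does not give the method's details, but the core idea (learning soft, differentiable gates over a shared network, with a sparsity term that prunes structures and a diversity term that pushes ensemble members toward different sub-networks) can be illustrated with a minimal NumPy sketch. All names, the gate parameterization, and the exact penalty forms below are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def sigmoid(x):
    """Squash gate logits into soft gates in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_units, n_members = 8, 3

# Hypothetical per-unit gate logits, one vector per ensemble member.
# In practice these would be trained jointly with the task loss.
gate_logits = rng.normal(0.0, 1.0, size=(n_members, n_units))

def soft_masks(logits):
    """Differentiable scaling factors over the shared architecture."""
    return sigmoid(logits)

def sparsity_penalty(masks):
    """L1-style term pushing gates toward zero, so structures get pruned."""
    return float(masks.sum())

def diversity_penalty(masks):
    """Penalize pairwise overlap so members select different sub-networks."""
    total = 0.0
    for i in range(len(masks)):
        for j in range(i + 1, len(masks)):
            total += float(np.dot(masks[i], masks[j]))
    return total

masks = soft_masks(gate_logits)
# Regularizer added to the (omitted) task loss; weights are placeholders.
reg_loss = 0.1 * sparsity_penalty(masks) + 0.1 * diversity_penalty(masks)

# After training, gates are thresholded to extract binary sub-structures,
# and each member keeps only its surviving units.
sub_networks = masks > 0.5
```

The sketch only shows the regularization side; the end-to-end objective in the paper also includes the task loss for every member, optimized jointly with the gates.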

Related research:

- 11/27/2022: Neural Architecture for Online Ensemble Continual Learning. "Continual learning with an increasing number of classes is a challenging..."
- 09/09/2019: Efficient Continual Learning in Neural Networks with Embedding Regularization. "Continual learning of deep neural networks is a key requirement for scal..."
- 11/02/2020: Modular-Relatedness for Continual Learning. "In this paper, we propose a continual learning (CL) technique that is be..."
- 10/28/2022: End-to-end Ensemble-based Feature Selection for Paralinguistics Tasks. "The events of recent years have highlighted the importance of telemedici..."
- 08/04/2020: Online Continual Learning under Extreme Memory Constraints. "Continual Learning (CL) aims to develop agents emulating the human abili..."
- 03/01/2018: Learning Sparse Structured Ensembles with SG-MCMC and Network Pruning. "An ensemble of neural networks is known to be more robust and accurate t..."
- 10/07/2021: Ensemble Neural Representation Networks. "Implicit Neural Representation (INR) has recently attracted considerable..."
