Efficient Continual Learning Ensembles in Neural Network Subspaces

02/20/2022
by Thang Doan et al.

A growing body of research in continual learning focuses on the catastrophic forgetting problem. While many attempts have been made to alleviate this problem, most methods assume a single model in the continual learning setup. In this work, we question this assumption and show that employing ensembles of models can be a simple yet effective way to improve continual learning performance. However, the training and inference costs of an ensemble grow linearly with the number of models. Motivated by this limitation, we leverage recent advances in the deep learning optimization literature, such as mode connectivity and neural network subspaces, to derive a new method that is computationally efficient and can outperform state-of-the-art continual learning algorithms.
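
The subspace idea mentioned above can be illustrated with a small sketch. The snippet below is a minimal, hypothetical example (not the authors' implementation): two endpoint parameter sets define a line segment in weight space, and ensemble members are obtained by evaluating interpolated points along that segment, so parameter storage stays at two copies regardless of how many ensemble members are evaluated. The names SubspaceMLP and set_alpha are assumptions made for this example.

```python
# Minimal sketch of inference-time ensembling over a line segment in weight
# space, in the spirit of neural network subspaces. Hypothetical names
# (SubspaceMLP, set_alpha); not the paper's implementation.
import copy
import torch
import torch.nn as nn

class SubspaceMLP(nn.Module):
    """Two endpoint copies of an MLP; forward() runs the interpolated weights."""
    def __init__(self, in_dim=784, hidden=256, out_dim=10):
        super().__init__()
        base = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                             nn.Linear(hidden, out_dim))
        self.end_a = base                 # first endpoint, theta_a
        self.end_b = copy.deepcopy(base)  # second endpoint, theta_b
        self.mixed = copy.deepcopy(base)  # buffer holding theta(alpha)

    def set_alpha(self, alpha: float):
        # theta(alpha) = (1 - alpha) * theta_a + alpha * theta_b
        with torch.no_grad():
            for p_m, p_a, p_b in zip(self.mixed.parameters(),
                                     self.end_a.parameters(),
                                     self.end_b.parameters()):
                p_m.copy_((1.0 - alpha) * p_a + alpha * p_b)

    def forward(self, x):
        return self.mixed(x)

# Ensemble prediction: average the outputs of a few points on the segment.
# Only two parameter sets are stored, however many points are evaluated.
model = SubspaceMLP()
x = torch.randn(8, 784)  # dummy batch of inputs
outputs = []
for alpha in (0.0, 0.5, 1.0):
    model.set_alpha(alpha)
    outputs.append(model(x))
avg_logits = torch.stack(outputs).mean(dim=0)
```

In practice the endpoints would be trained jointly, with the mixing done differentiably, so that every point on the segment yields a low-loss model; the sketch above only shows inference-time ensembling.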

Related research

10/29/2017 - Variational Continual Learning
This paper develops variational continual learning (VCL), a simple but g...

10/09/2020 - Linear Mode Connectivity in Multitask and Continual Learning
Continual (sequential) training and multitask (simultaneous) training ar...

11/27/2022 - Neural Architecture for Online Ensemble Continual Learning
Continual learning with an increasing number of classes is a challenging...

03/27/2023 - CoDeC: Communication-Efficient Decentralized Continual Learning
Training at the edge utilizes continuously evolving data generated at di...

07/11/2022 - Repairing Neural Networks by Leaving the Right Past Behind
Prediction failures of machine learning models often arise from deficien...

05/19/2022 - Interpolating Compressed Parameter Subspaces
Inspired by recent work on neural subspaces and mode connectivity, we re...