Out-of-distribution forgetting: vulnerability of continual learning to intra-class distribution shift

06/01/2023
by Liangxuan Guo, et al.

Continual learning (CL) is an important technique that allows artificial neural networks to work in open environments. CL enables a system to learn new tasks without severe interference with its performance on old tasks, i.e., to overcome the problem of catastrophic forgetting. In joint learning, it is well known that the out-of-distribution (OOD) problem caused by intentional attacks or environmental perturbations severely impairs the ability of networks to generalize. In this work, we report a special form of catastrophic forgetting raised by the OOD problem in continual learning settings, which we name out-of-distribution forgetting (OODF). In continual image classification tasks, we found that, for a given category, introducing an intra-class distribution shift significantly impaired the recognition accuracy of CL methods for that category during subsequent learning. Interestingly, this phenomenon is specific to CL: the same level of distribution shift had only negligible effects in the joint learning scenario. We verified that CL methods without dedicated subnetworks for individual tasks are all vulnerable to OODF. Moreover, OODF does not depend on any specific way of shifting the distribution, suggesting it is a risk for CL in a wide range of circumstances. Taken together, our work identifies an under-attended risk in CL, highlighting the importance of developing approaches that can overcome OODF.
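To make the described setup concrete, here is a minimal sketch of how such an intra-class distribution shift experiment might be wired up. This is our own illustration under assumed conventions (PyTorch, a single classifier shared across tasks with no dedicated subnetworks); the Gaussian-noise shift and all names such as SimpleNet, shift_class, and class_accuracy are hypothetical, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleNet(nn.Module):
    """One classifier shared across all tasks (no dedicated subnetworks)."""
    def __init__(self, in_dim=784, n_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, n_classes),
        )

    def forward(self, x):
        return self.net(x.flatten(1))

def shift_class(x, y, target_class, noise_std=0.5):
    """Intra-class distribution shift: perturb samples of one category only.
    Gaussian noise is used here; the abstract notes the specific way of
    shifting the distribution does not matter."""
    x = x.clone()
    mask = y == target_class
    x[mask] += noise_std * torch.randn_like(x[mask])
    return x

def train_task(model, loader, shifted_class=None, epochs=1, lr=1e-3):
    """Train on one task; optionally inject the shift into one class."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            if shifted_class is not None:
                x = shift_class(x, y, shifted_class)
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()

@torch.no_grad()
def class_accuracy(model, loader, cls):
    """Recognition accuracy restricted to a single category."""
    model.eval()
    correct, total = 0, 0
    for x, y in loader:
        mask = y == cls
        if mask.any():
            correct += (model(x[mask]).argmax(1) == y[mask]).sum().item()
            total += int(mask.sum())
    return correct / max(total, 1)
```

The comparison of interest is class_accuracy on the shifted category after later tasks have been trained sequentially, versus the same measurement when all tasks are trained jointly; OODF is the gap between the two.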
