IF2Net: Innately Forgetting-Free Networks for Continual Learning

06/18/2023
by Depeng Li, et al.

Continual learning aims to incrementally absorb new concepts without interfering with previously learned knowledge. Motivated by the characteristic of neural networks that information is stored in the weights on connections, we investigate how to design an Innately Forgetting-Free Network (IF2Net) for the continual learning context. This study proposes a straightforward yet effective learning paradigm that keeps the weights relevant to each seen task untouched before and after learning a new task. We first present novel representation-level learning on task sequences with random weights. This technique tweaks the representations drifted by randomization back to their separate task-optimal working states, while the involved weights are frozen and reused (in contrast to the well-known layer-wise weight updates). Then, sequential decision-making without forgetting is achieved by projecting the output-weight updates into a parsimonious orthogonal space, so that the adaptations do not disturb old knowledge while the model retains plasticity. By integrating the respective strengths of randomization and orthogonalization, IF2Net allows a single network to inherently learn unlimited mapping rules without being told task identities at test time. We validate the effectiveness of our approach through extensive theoretical analysis and empirical study.
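The abstract combines two ingredients: frozen random-weight representations and output-weight updates projected into a space orthogonal to directions used by earlier tasks. The sketch below illustrates only the second ingredient, assuming an orthogonal-projection update in the spirit of recursive-least-squares projectors; the class name `OrthogonalOutputLayer`, the projector update rule, and the hyperparameters `lr` and `alpha` are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

class OrthogonalOutputLayer:
    """Hypothetical sketch: a linear output layer whose weight updates are
    projected onto the subspace orthogonal to the feature directions already
    occupied by previous tasks, so new-task learning does not overwrite them."""

    def __init__(self, feat_dim, num_classes, lr=0.1, alpha=1e-3):
        self.W = np.zeros((feat_dim, num_classes))  # output weights
        self.P = np.eye(feat_dim)                   # projector onto the "free" subspace
        self.lr = lr                                # learning rate (assumed value)
        self.alpha = alpha                          # regularizer for the projector update

    def forward(self, h):
        # h: (batch, feat_dim) representations from frozen random weights
        return h @ self.W

    def update(self, h, grad_logits):
        # Plain gradient of a linear layer with respect to its weights
        grad_W = h.T @ grad_logits / len(h)
        # Project the update so it stays orthogonal to old-task feature directions
        self.W -= self.lr * (self.P @ grad_W)
        # Shrink the projector with the new inputs (recursive-least-squares style)
        for x in h:
            x = x[:, None]                          # column vector (feat_dim, 1)
            Px = self.P @ x
            self.P -= (Px @ Px.T) / (self.alpha + x.T @ Px)
```

In this sketch, representations of later tasks progressively shrink the projector `P`, so gradient steps for a new task are confined to directions the earlier tasks did not use; the hidden-layer weights themselves are never touched, matching the paper's stated goal of keeping task-relevant weights fixed.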


Related research

03/24/2022 | Probing Representation Forgetting in Supervised and Unsupervised Continual Learning
Continual Learning research typically focuses on tackling the phenomenon...

04/19/2021 | Few-shot Continual Learning: a Brain-inspired Approach
It is an important yet challenging setting to continually learn new task...

05/29/2019 | Meta-Learning Representations for Continual Learning
A continual learning agent should be able to build on top of existing kn...

04/03/2023 | Knowledge Accumulation in Continually Learned Representations and the Issue of Feature Forgetting
By default, neural networks learn on all training data at once. When suc...

09/16/2022 | Continual Learning with Dependency Preserving Hypernetworks
Humans learn continually throughout their lifespan by accumulating diver...

06/16/2023 | Multi-View Class Incremental Learning
Multi-view learning (MVL) has gained great success in integrating inform...

03/27/2023 | CoDeC: Communication-Efficient Decentralized Continual Learning
Training at the edge utilizes continuously evolving data generated at di...
