Representation Memorization for Fast Learning New Knowledge without Forgetting

08/28/2021
by   Fei Mi, et al.

The ability to quickly learn new knowledge (e.g., new classes or data distributions) is a big step towards human-level intelligence. In this paper, we consider scenarios that require learning new classes or data distributions quickly and incrementally over time, as often occurs in real-world dynamic environments. We propose "Memory-based Hebbian Parameter Adaptation" (Hebb) to tackle the two major challenges toward this goal, catastrophic forgetting and sample efficiency, in a unified framework. To mitigate catastrophic forgetting, Hebb augments a regular neural classifier with a continuously updated memory module that stores representations of previous data. To improve sample efficiency, we propose a parameter adaptation method based on the well-known Hebbian theory, which directly "wires" the output network's parameters with similar representations retrieved from the memory. We empirically verify the superior performance of Hebb through extensive experiments on a wide range of learning tasks (image classification, language modeling) and learning scenarios (continual, incremental, online). We demonstrate that Hebb effectively mitigates catastrophic forgetting and learns new knowledge both better and faster than the current state-of-the-art.
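To make the idea concrete, here is a minimal sketch of a memory-augmented linear classifier with Hebbian parameter adaptation. This is an illustration of the general mechanism the abstract describes, not the paper's exact algorithm: the class name, the cosine-similarity retrieval, and the hyperparameters `k` and `alpha` are all assumptions for this sketch.

```python
import numpy as np

class HebbClassifier:
    """Illustrative sketch (not the paper's exact method): a linear
    output layer over encoder features, augmented with a memory of
    (representation, label) pairs used for Hebbian adaptation."""

    def __init__(self, feat_dim, num_classes, k=5, alpha=0.5):
        self.W = np.zeros((num_classes, feat_dim))  # output-layer weights
        self.keys = []       # stored representations of previous data
        self.labels = []     # their class labels
        self.k = k           # number of neighbors to retrieve
        self.alpha = alpha   # strength of the Hebbian adaptation
        self.num_classes = num_classes

    def memorize(self, h, y):
        """Continuously update the memory with a new representation."""
        self.keys.append(h)
        self.labels.append(y)

    def adapted_weights(self, h):
        """Retrieve the k representations most similar to the query h
        and 'wire' them into the output weights: each retrieved pair
        adds a similarity-weighted outer product that strengthens the
        connection between those features and their class unit."""
        if not self.keys:
            return self.W
        K = np.stack(self.keys)
        sims = K @ h / (np.linalg.norm(K, axis=1) * np.linalg.norm(h) + 1e-8)
        idx = np.argsort(-sims)[: self.k]
        W = self.W.copy()
        for i in idx:
            onehot = np.zeros(self.num_classes)
            onehot[self.labels[i]] = 1.0
            # Hebbian rule: pre-synaptic activity (stored representation)
            # times post-synaptic target activity (class one-hot).
            W += self.alpha * max(sims[i], 0.0) * np.outer(onehot, K[i])
        return W

    def predict(self, h):
        return int(np.argmax(self.adapted_weights(h) @ h))
```

Because the adaptation happens at prediction time from retrieved memories, a new class can influence the output layer after a single stored example, without gradient updates that would overwrite previously learned weights.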

Related research:

- Complementary Learning Subnetworks for Parameter-Efficient Class-Incremental Learning (06/21/2023)
- Memory-based Parameter Adaptation (02/28/2018)
- Efficient Meta Lifelong-Learning with Limited Memory (10/06/2020)
- Lifelong Learning from Event-based Data (11/11/2021)
- GMM-IL: Image Classification using Incrementally Learnt, Independent Probabilistic Models for Small Sample Sizes (12/01/2022)
- Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning (10/24/2019)
- Online Class-Incremental Learning For Real-World Food Classification (01/12/2023)
