
Continual Learning via Bit-Level Information Preserving

05/10/2021 · by Yujun Shi, et al.

Continual learning tackles the setting of learning different tasks sequentially. Despite the many previous solutions, most of them still suffer from significant forgetting or expensive memory costs. In this work, targeting these problems, we first study the continual learning process through the lens of information theory and observe that a model's forgetting stems from the loss of information gain on its parameters from previous tasks when learning a new task. From this viewpoint, we then propose a novel continual learning approach called Bit-Level Information Preserving (BLIP), which preserves the information gain on model parameters by updating the parameters at the bit level; this can be conveniently implemented with parameter quantization. More specifically, BLIP first trains a neural network with weight quantization on the new incoming task and then estimates the information gain on each parameter provided by the task data to determine which bits to freeze to prevent forgetting. We conduct extensive experiments ranging from classification tasks to reinforcement learning tasks, and the results show that our method produces results better than or on par with previous state-of-the-art methods. Indeed, BLIP achieves close to zero forgetting while requiring only constant memory overhead throughout continual learning.
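To make the bit-freezing step concrete, here is a minimal Python sketch of the idea, not the authors' implementation: it assumes each parameter is stored as a NUM_BITS-bit quantized integer and that frozen_bits, the number of high-order bits to protect per parameter, has already been derived from an information-gain estimate on previous tasks (the paper estimates this via Fisher information). The names NUM_BITS and freeze_high_bits are hypothetical.

```python
NUM_BITS = 10  # hypothetical quantization bit-width


def freeze_high_bits(old_q: int, new_q: int, frozen_bits: int) -> int:
    """Keep the `frozen_bits` most significant bits of the previously
    learned quantized value; take the remaining low bits from the update."""
    low_mask = (1 << (NUM_BITS - frozen_bits)) - 1  # free (updatable) bits
    high_mask = ((1 << NUM_BITS) - 1) ^ low_mask    # frozen high-order bits
    return (old_q & high_mask) | (new_q & low_mask)


# Quantized parameter value after previous tasks:
old_q = 0b1011001110
# Unconstrained quantized update after training on the new task:
new_q = 0b1100110101

# Suppose the information-gain estimate indicates the top 4 bits carry the
# knowledge accumulated on earlier tasks, so those 4 bits stay frozen:
merged = freeze_high_bits(old_q, new_q, frozen_bits=4)
print(f"{merged:0{NUM_BITS}b}")  # 1011110101: high 4 bits old, low 6 new
```

After each new-task update, masking of this kind restores the protected high-order bits, so the information they encode from earlier tasks cannot be overwritten, while the free low-order bits remain available for learning the new task.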

Related Research

05/31/2018 · Reinforced Continual Learning
Most artificial intelligence models have limiting ability to solve new t...

07/09/2020 · Graph-Based Continual Learning
Despite significant advances, continual learning models still suffer fro...

04/26/2022 · Theoretical Understanding of the Information Flow on Continual Learning Performance
Continual learning (CL) is a setting in which an agent has to learn from...

12/19/2022 · DSI++: Updating Transformer Memory with New Documents
Differentiable Search Indices (DSIs) encode a corpus of documents in the...

11/14/2022 · Hierarchically Structured Task-Agnostic Continual Learning
One notable weakness of current machine learning algorithms is the poor ...

11/03/2022 · Continual Learning of Neural Machine Translation within Low Forgetting Risk Regions
This paper considers continual learning of large-scale pretrained neural...

Code Repositories

BLIP

Official implementation of the CVPR 2021 paper: Continual Learning via Bit-Level Information Preserving

