
Continual Learning with Neuron Activation Importance

07/27/2021
by Sohee Kim, et al., Kyung Hee University

Continual learning is a form of online learning in which a model is trained on multiple tasks sequentially. One of the critical barriers to continual learning is that the network must learn each new task while retaining the knowledge of old tasks, without access to any data from those old tasks. In this paper, we propose a neuron activation importance-based regularization method for stable continual learning regardless of the order of tasks. We conduct comprehensive experiments on existing benchmark data sets to evaluate not only the stability and plasticity of our method, with improved classification accuracy, but also the robustness of its performance to changes in task order.
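The abstract does not spell out how activation importance is computed or how the regularizer is applied. As a rough illustration only, the sketch below assumes importance is the mean absolute activation of each neuron on the previous task's data, used to weight an EWC-style quadratic penalty that discourages drift in that neuron's incoming weights. The class name `ActivationImportance`, the ReLU nonlinearity, and the weighting coefficient `lam` are all hypothetical choices, not taken from the paper.

```python
import torch
import torch.nn as nn

class ActivationImportance:
    """Tracks per-neuron activation importance for one linear layer and
    turns it into a quadratic regularization penalty (hypothetical scheme)."""

    def __init__(self, layer: nn.Linear):
        self.layer = layer
        # One importance score per output neuron of the layer.
        self.importance = torch.zeros(layer.out_features)
        # Snapshot of the weights after the previous task ("anchor").
        self.anchor = layer.weight.detach().clone()
        self.batches = 0

    def accumulate(self, x: torch.Tensor) -> None:
        # ASSUMPTION: importance = mean |post-ReLU activation| per neuron,
        # averaged over batches of the previous task's data.
        with torch.no_grad():
            act = torch.relu(self.layer(x))
            self.importance += act.abs().mean(dim=0)
            self.batches += 1

    def finalize(self) -> None:
        # Average over batches and anchor the current weights.
        self.importance /= max(self.batches, 1)
        self.anchor = self.layer.weight.detach().clone()

    def penalty(self) -> torch.Tensor:
        # Penalize drift of each neuron's incoming weights, scaled by
        # that neuron's importance (EWC-style, activation-weighted).
        drift = (self.layer.weight - self.anchor) ** 2
        return (self.importance.unsqueeze(1) * drift).sum()
```

When training on a new task, the total loss would then be something like `loss = criterion(model(x), y) + lam * reg.penalty()`, where `lam` trades off plasticity against stability; the abstract does not give a value for such a coefficient.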


Related research:

06/26/2021
Continual Learning via Inter-Task Synaptic Mapping
Learning from streaming tasks leads a model to catastrophically erase un...

04/12/2021
Continual Learning for Text Classification with Information Disentanglement Based Regularization
Continual learning has become increasingly important as it enables NLP m...

01/05/2022
Hyperparameter-free Continuous Learning for Domain Classification in Natural Language Understanding
Domain classification is the fundamental task in natural language unders...

06/09/2020
Variational Auto-Regressive Gaussian Processes for Continual Learning
This paper proposes Variational Auto-Regressive Gaussian Process (VAR-GP...

12/18/2018
Continual Match Based Training in Pommerman: Technical Report
Continual learning is the ability of agents to improve their capacities ...

03/13/2021
Online Learning of Objects through Curiosity-Driven Active Learning
Children learn continually by asking questions about the concepts they a...

11/15/2021
Target Layer Regularization for Continual Learning Using Cramer-Wold Generator
We propose an effective regularization strategy (CW-TaLaR) for solving c...