Elastic Weight Consolidation (EWC): Nuts and Bolts

05/10/2021
by Abhishek Aich, et al.

In this report, we present theoretical support for the continual learning method Elastic Weight Consolidation (EWC), introduced in the paper titled `Overcoming catastrophic forgetting in neural networks'. As one of the most cited papers among regularization-based methods for continual learning, EWC merits a closer look: this report disentangles the underlying concept of its proposed objective function. We assume that the reader is familiar with the basic terminology of continual learning.
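The objective function the report analyzes combines the loss of the current task with a quadratic penalty that anchors parameters important to previous tasks. A minimal sketch of that penalty, assuming per-parameter Fisher information estimates `fisher`, old optimal parameters `theta_star`, and a regularization strength `lam` (all names are illustrative, not from the paper's code):

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    Parameters deemed important for the previous task (large F_i)
    are pulled strongly back toward their old values theta*_i.
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

def ewc_loss(task_loss, theta, theta_star, fisher, lam=1.0):
    # Total objective for the new task: current task loss plus the penalty.
    return task_loss + ewc_penalty(theta, theta_star, fisher, lam)

# Example: only the second parameter moved, and it carries Fisher weight 2.
theta = np.array([1.0, 2.0])
theta_star = np.array([1.0, 1.0])
fisher = np.array([2.0, 2.0])
print(ewc_penalty(theta, theta_star, fisher))  # 0.5 * (2 * 1^2) = 1.0
```

In practice the Fisher information is typically approximated from squared gradients of the log-likelihood on the previous task's data; the sketch above only illustrates the shape of the penalty itself.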


