Continual Learning with Deep Learning Methods in an Application-Oriented Context

07/12/2022
by Benedikt Pfülb, et al.

Abstract knowledge is deeply grounded in many computer-based applications. An important research area of Artificial Intelligence (AI) deals with the automatic derivation of knowledge from data, and machine learning provides the corresponding algorithms. One line of research focuses on the development of biologically inspired learning algorithms. These machine learning methods are based on neurological concepts so that they can systematically derive knowledge from data and store it. One class of machine learning algorithms that can be categorized as "deep learning" models is referred to as Deep Neural Networks (DNNs). DNNs consist of multiple artificial neurons arranged in layers that are trained using the backpropagation algorithm. These deep learning methods exhibit remarkable capabilities for inferring and storing complex knowledge from high-dimensional data. However, DNNs are affected by a problem that prevents new knowledge from being added to an existing base. The ability to continuously accumulate knowledge is an important factor that contributed to evolution and is therefore a prerequisite for the development of strong AIs. The so-called "catastrophic forgetting" (CF) effect causes DNNs to immediately lose already derived knowledge after a few training iterations on a new data distribution. Only an energetically expensive retraining on the joint data distribution of past and new data enables the abstraction of the entire new body of knowledge. To counteract this effect, various techniques have been and are still being developed with the goal of mitigating or even solving the CF problem. The published CF-avoidance studies usually claim the effectiveness of their approaches for various continual learning tasks. This dissertation is set in the context of continual machine learning with deep learning methods. The first part deals with the development of an ...
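The catastrophic forgetting effect described above can be illustrated with a deliberately tiny toy model (a hypothetical one-parameter linear regressor, not the dissertation's experimental setup): after fitting task A, a few gradient steps on a conflicting task B wipe out the previously learned solution.

```python
# Minimal sketch of catastrophic forgetting: a one-parameter model
# y = w * x trained by plain gradient descent on mean squared error.
# Task names and data are illustrative assumptions.

def train(w, data, lr=0.1, epochs=100):
    """Gradient descent on MSE; grad of (w*x - y)^2 w.r.t. w is 2*(w*x - y)*x."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def mse(w, data):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

task_a = [(x, 2.0 * x) for x in (1.0, 2.0, 3.0)]   # optimal w = +2
task_b = [(x, -2.0 * x) for x in (1.0, 2.0, 3.0)]  # optimal w = -2

w = train(0.0, task_a)
loss_a_before = mse(w, task_a)   # near zero: task A is learned

w = train(w, task_b)             # sequential training on task B only
loss_a_after = mse(w, task_a)    # large: task A knowledge is overwritten

print(loss_a_before, loss_a_after)
```

Retraining on the concatenated data `task_a + task_b` would recover a compromise solution, which is exactly the energetically expensive joint retraining the abstract refers to.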



