ATLAS: Universal Function Approximator for Memory Retention

08/10/2022
by Heinrich van Deventer, et al.

Artificial neural networks (ANNs), despite their universal function approximation capability and practical success, are subject to catastrophic forgetting. Catastrophic forgetting refers to the abrupt unlearning of a previous task when a new task is learned. It is an emergent phenomenon that hinders continual learning. Existing universal function approximation theorems for ANNs guarantee function approximation ability, but do not predict catastrophic forgetting. This paper presents a novel universal approximation theorem for multi-variable functions using only single-variable functions and exponential functions. Furthermore, we present ATLAS: a novel ANN architecture based on the new theorem. It is shown that ATLAS is a universal function approximator capable of some memory retention and continual learning. The memory of ATLAS is imperfect, with some off-target effects during continual learning, but it is well-behaved and predictable. An efficient implementation of ATLAS is provided. Experiments are conducted to evaluate both the function approximation and memory retention capabilities of ATLAS.
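The key algebraic idea behind such a theorem can be illustrated in a few lines. The sketch below is not the paper's ATLAS implementation; it only demonstrates the assumed building block: because exp(u(x) + v(y)) = exp(u(x)) * exp(v(y)), an exponential applied to a sum of single-variable functions yields a product of single-variable functions, and sums of such terms can then build up multi-variable functions. The helper name `separable_term` is hypothetical.

```python
import numpy as np

def separable_term(u, v, x, y):
    """One exponential-of-a-sum term: exp(u(x) + v(y)).

    Since exp turns sums into products, this equals exp(u(x)) * exp(v(y)),
    i.e. a product of single-variable functions, using only single-variable
    functions and the exponential.
    """
    return np.exp(u(x) + v(y))

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 0.5, 2.0])

# Example: with u = v = log on positive inputs,
# exp(log x + log y) = x * y, a genuinely multi-variable function.
approx = separable_term(np.log, np.log, x, y)
print(np.allclose(approx, x * y))  # True
```

A sum of several such terms, each with its own learned single-variable functions, is the kind of expansion a universal approximation result of this form would rely on.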

research
06/06/2019

Localizing Catastrophic Forgetting in Neural Networks

Artificial neural networks (ANNs) suffer from catastrophic forgetting wh...
research
05/12/2022

KASAM: Spline Additive Models for Function Approximation

Neural networks have been criticised for their inability to perform cont...
research
02/16/2019

Realizing Continual Learning through Modeling a Learning System as a Fiber Bundle

A human brain is capable of continual learning by nature; however the cu...
research
07/18/2023

HAT-CL: A Hard-Attention-to-the-Task PyTorch Library for Continual Learning

Catastrophic forgetting, the phenomenon in which a neural network loses ...
research
06/01/2023

Out-of-distribution forgetting: vulnerability of continual learning to intra-class distribution shift

Continual learning (CL) is an important technique to allow artificial ne...
research
01/13/2021

EEC: Learning to Encode and Regenerate Images for Continual Learning

The two main impediments to continual learning are catastrophic forgetti...
research
05/25/2023

SketchOGD: Memory-Efficient Continual Learning

When machine learning models are trained continually on a sequence of ta...
